Mar 08 05:26:11 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 08 05:26:12 crc restorecon[4679]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 08 05:26:12 crc restorecon[4679]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc 
restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc 
restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 
05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc 
restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc 
restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12
crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 
05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 05:26:12 crc 
restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc 
restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 08 05:26:12 crc restorecon[4679]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 
crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc 
restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc 
restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 08 05:26:12 crc restorecon[4679]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc 
restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 05:26:12 crc restorecon[4679]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 05:26:12 crc restorecon[4679]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 08 05:26:13 crc kubenswrapper[4717]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 05:26:13 crc kubenswrapper[4717]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 08 05:26:13 crc kubenswrapper[4717]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 05:26:13 crc kubenswrapper[4717]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 08 05:26:13 crc kubenswrapper[4717]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 08 05:26:13 crc kubenswrapper[4717]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.512429 4717 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.518928 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.518975 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.518985 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.518994 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519006 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519017 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519026 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519036 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519044 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519052 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519060 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519068 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519077 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519084 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519092 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519101 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519112 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519122 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519131 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519139 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519147 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519154 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519162 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519170 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519198 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519206 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519213 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519221 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519229 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519236 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519244 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519252 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519260 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519268 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519276 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519286 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519294 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519302 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519310 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519317 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519325 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519333 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519340 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519348 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519356 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519363 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519371 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519379 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519387 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519395 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519405 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519413 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519421 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519428 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519436 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519444 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519451 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519459 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519466 4717 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519474 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519482 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519492 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519502 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519511 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519521 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519529 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519537 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519545 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519555 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519563 4717 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.519573 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520482 4717 flags.go:64] FLAG: --address="0.0.0.0"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520504 4717 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520521 4717 flags.go:64] FLAG: --anonymous-auth="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520533 4717 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520546 4717 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520555 4717 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520566 4717 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520577 4717 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520586 4717 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520596 4717 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520606 4717 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520616 4717 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520626 4717 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520635 4717 flags.go:64] FLAG: --cgroup-root=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520644 4717 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520653 4717 flags.go:64] FLAG: --client-ca-file=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520662 4717 flags.go:64] FLAG: --cloud-config=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520670 4717 flags.go:64] FLAG: --cloud-provider=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520679 4717 flags.go:64] FLAG: --cluster-dns="[]"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520697 4717 flags.go:64] FLAG: --cluster-domain=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520729 4717 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520738 4717 flags.go:64] FLAG: --config-dir=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520747 4717 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520757 4717 flags.go:64] FLAG: --container-log-max-files="5"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520768 4717 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520778 4717 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520787 4717 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520797 4717 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520806 4717 flags.go:64] FLAG: --contention-profiling="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520815 4717 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520826 4717 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520836 4717 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520845 4717 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520856 4717 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520865 4717 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520874 4717 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520884 4717 flags.go:64] FLAG: --enable-load-reader="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520894 4717 flags.go:64] FLAG: --enable-server="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520903 4717 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520915 4717 flags.go:64] FLAG: --event-burst="100"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520925 4717 flags.go:64] FLAG: --event-qps="50"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520935 4717 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520944 4717 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520955 4717 flags.go:64] FLAG: --eviction-hard=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520966 4717 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520976 4717 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520985 4717 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.520994 4717 flags.go:64] FLAG: --eviction-soft=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521003 4717 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521012 4717 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521021 4717 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521031 4717 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521040 4717 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521049 4717 flags.go:64] FLAG: --fail-swap-on="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521058 4717 flags.go:64] FLAG: --feature-gates=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521069 4717 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521078 4717 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521089 4717 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521099 4717 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521108 4717 flags.go:64] FLAG: --healthz-port="10248"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521117 4717 flags.go:64] FLAG: --help="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521127 4717 flags.go:64] FLAG: --hostname-override=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521135 4717 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521144 4717 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521154 4717 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521163 4717 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521172 4717 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521181 4717 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521190 4717 flags.go:64] FLAG: --image-service-endpoint=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521198 4717 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521207 4717 flags.go:64] FLAG: --kube-api-burst="100"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521217 4717 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521226 4717 flags.go:64] FLAG: --kube-api-qps="50"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521236 4717 flags.go:64] FLAG: --kube-reserved=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521245 4717 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521254 4717 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521263 4717 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521272 4717 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521282 4717 flags.go:64] FLAG: --lock-file=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521290 4717 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521299 4717 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521308 4717 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521321 4717 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521330 4717 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521339 4717 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521348 4717 flags.go:64] FLAG: --logging-format="text"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521356 4717 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521366 4717 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521375 4717 flags.go:64] FLAG: --manifest-url=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521385 4717 flags.go:64] FLAG: --manifest-url-header=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521396 4717 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521405 4717 flags.go:64] FLAG: --max-open-files="1000000"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521416 4717 flags.go:64] FLAG: --max-pods="110"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521425 4717 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521434 4717 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521443 4717 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521452 4717 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521461 4717 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521471 4717 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521480 4717 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521499 4717 flags.go:64] FLAG: --node-status-max-images="50"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521508 4717 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521517 4717 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521527 4717 flags.go:64] FLAG: --pod-cidr=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521536 4717 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521550 4717 flags.go:64] FLAG: --pod-manifest-path=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521559 4717 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521569 4717 flags.go:64] FLAG: --pods-per-core="0"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521578 4717 flags.go:64] FLAG: --port="10250"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521588 4717 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521597 4717 flags.go:64] FLAG: --provider-id=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521606 4717 flags.go:64] FLAG: --qos-reserved=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521615 4717 flags.go:64] FLAG: --read-only-port="10255"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521624 4717 flags.go:64] FLAG: --register-node="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521633 4717 flags.go:64] FLAG: --register-schedulable="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521641 4717 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521656 4717 flags.go:64] FLAG: --registry-burst="10"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521664 4717 flags.go:64] FLAG: --registry-qps="5"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521674 4717 flags.go:64] FLAG: --reserved-cpus=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521687 4717 flags.go:64] FLAG: --reserved-memory=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521721 4717 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521730 4717 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521740 4717 flags.go:64] FLAG: --rotate-certificates="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521749 4717 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521758 4717 flags.go:64] FLAG: --runonce="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521767 4717 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521776 4717 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521785 4717 flags.go:64] FLAG: --seccomp-default="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521794 4717 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521803 4717 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521812 4717 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521822 4717 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521831 4717 flags.go:64] FLAG: --storage-driver-password="root"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521840 4717 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521850 4717 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521859 4717 flags.go:64] FLAG: --storage-driver-user="root"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521868 4717 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521877 4717 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521886 4717 flags.go:64] FLAG: --system-cgroups=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521896 4717 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521910 4717 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521919 4717 flags.go:64] FLAG: --tls-cert-file=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521927 4717 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521937 4717 flags.go:64] FLAG: --tls-min-version=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521946 4717 flags.go:64] FLAG: --tls-private-key-file=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521956 4717 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521965 4717 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521974 4717 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521983 4717 flags.go:64] FLAG: --v="2"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.521995 4717 flags.go:64] FLAG: --version="false"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.522006 4717 flags.go:64] FLAG: --vmodule=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.522016 4717 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.522025 4717 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522325 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522337 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522346 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522355 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522363 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522372 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522382 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522392 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522400 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522408 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522418 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522427 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522437 4717 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522445 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522452 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522461 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522468 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522476 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522484 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522493 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522501 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522508 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522516 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522524 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522531 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522539 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522546 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522554 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522570 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522578 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522586 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522593 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522613 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522623 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522632 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522641 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522649 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522657 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522666 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522673 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522682 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522695 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522731 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522739 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522748 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522756 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522764 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522773 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522781 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522789 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522796 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522804 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522811 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522819 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522827 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522835 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522842 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522850 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522858 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522866 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522874 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522882 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522890 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522898 4717 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522912 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522920 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522931 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522941 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522951 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522960 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.522969 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.522981 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.538416 4717 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.539029 4717 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539200 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539230 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539241 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539255 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539267 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539278 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539288 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539299 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539309 4717 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539321 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539331 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539341 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539351 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539361 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 
05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539371 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539381 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539390 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539399 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539408 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539419 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539429 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539439 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539450 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539462 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539473 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539483 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539494 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539508 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539522 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539533 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539543 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539557 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539570 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539582 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539595 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539606 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539616 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539626 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539640 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539656 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539668 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539679 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539697 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539735 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539748 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539760 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539770 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539780 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539791 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539802 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539812 4717 feature_gate.go:330] unrecognized feature gate: Example Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539821 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539832 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539842 4717 feature_gate.go:330] 
unrecognized feature gate: ManagedBootImages Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539852 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539864 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539875 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539885 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539895 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539906 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539915 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539926 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539936 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539946 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539956 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539966 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539976 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539986 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 
05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.539996 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540006 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540022 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.540041 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540323 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540341 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540353 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540364 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540375 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540385 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540395 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 05:26:13 
crc kubenswrapper[4717]: W0308 05:26:13.540405 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540416 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540428 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540438 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540448 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540459 4717 feature_gate.go:330] unrecognized feature gate: Example Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540468 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540479 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540493 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540503 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540513 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540524 4717 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540533 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540543 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540556 4717 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540570 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540581 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540592 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540602 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540613 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540624 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540634 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540644 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540655 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540665 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540676 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540693 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540733 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540744 4717 feature_gate.go:330] 
unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540754 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540764 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540775 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540785 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540796 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540810 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540821 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540832 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540842 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540853 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540864 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540875 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540885 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540894 4717 feature_gate.go:330] unrecognized feature gate: 
AutomatedEtcdBackup Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540906 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540916 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540926 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540935 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540949 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540962 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540975 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.540989 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541002 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541012 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541023 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541035 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541045 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541057 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541069 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541080 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541092 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541103 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541115 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541125 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.541137 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.541154 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false 
EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.542594 4717 server.go:940] "Client rotation is on, will bootstrap in background" Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.548876 4717 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.555767 4717 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.555935 4717 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.557972 4717 server.go:997] "Starting client certificate rotation" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.558020 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.558255 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.582586 4717 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.587908 4717 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.591160 4717 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.610192 4717 log.go:25] "Validated CRI v1 runtime API" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.651363 4717 log.go:25] "Validated CRI v1 image API" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.653581 4717 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.660166 4717 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-08-05-21-07-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.660216 4717 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.678546 4717 manager.go:217] Machine: {Timestamp:2026-03-08 05:26:13.675936228 +0000 UTC m=+0.593585092 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:19975744-be96-4ad0-8b81-d51bfb4105e2 BootID:7d163e68-b679-4e39-be4b-f06654c828fa Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:a2:cc:15 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a2:cc:15 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d5:75:ae Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:86:73:61 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:e5:77:e3 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c8:e1:5c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:66:44:33:fb:19:e4 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:92:f8:51:10:0b:51 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.678776 4717 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.679014 4717 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.680072 4717 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.680261 4717 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.680294 4717 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.680473 4717 topology_manager.go:138] "Creating topology manager with none policy"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.680482 4717 container_manager_linux.go:303] "Creating device plugin manager"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.680993 4717 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.681047 4717 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.681246 4717 state_mem.go:36] "Initialized new in-memory state store"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.681329 4717 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.684634 4717 kubelet.go:418] "Attempting to sync node with API server"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.684655 4717 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.684678 4717 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.684696 4717 kubelet.go:324] "Adding apiserver pod source"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.684722 4717 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.688245 4717 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.689371 4717 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.694255 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.694921 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError"
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.694519 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.695029 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.694526 4717 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.696796 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.696847 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.696863 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.696878 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.696901 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.696972 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.696986 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.697013 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.697599 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.697673 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.697759 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.697788 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.698932 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.699788 4717 server.go:1280] "Started kubelet"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.700491 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.701112 4717 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.701126 4717 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 08 05:26:13 crc systemd[1]: Started Kubernetes Kubelet.
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.702384 4717 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.704509 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.704610 4717 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.704957 4717 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.704986 4717 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.706064 4717 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.706499 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.706916 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="200ms"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.706968 4717 server.go:460] "Adding debug handlers to kubelet server"
Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.706958 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.707891 4717 factory.go:55] Registering systemd factory
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.707923 4717 factory.go:221] Registration of the systemd container factory successfully
Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.708481 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.708754 4717 factory.go:153] Registering CRI-O factory
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.708780 4717 factory.go:221] Registration of the crio container factory successfully
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.708905 4717 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.708956 4717 factory.go:103] Registering Raw factory
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.708994 4717 manager.go:1196] Started watching for new ooms in manager
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.711237 4717 manager.go:319] Starting recovery of all containers
Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.709929 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ac66cbfc374df default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.699736799 +0000 UTC m=+0.617385693,LastTimestamp:2026-03-08 05:26:13.699736799 +0000 UTC m=+0.617385693,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.725335 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.725611 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.725781 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.725966 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.726107 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.726228 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.726353 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.726485 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.726628 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.726802 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.726935 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.727059 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.727231 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.727361 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.727481 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.727601 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.727760 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.727904 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.728028 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.728213 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.728334 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.728450 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.730846 4717 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.731040 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.731170 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.731317 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.731443 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.731601 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.731776 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.731908 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.732048 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.732178 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.732299 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.732434 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.732555 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.732787 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.732918 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.733036 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.733177 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.733306 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.733432 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.733552 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.733669 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.733827 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.733996 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.734128 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.734246 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.734366 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.734483 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.734619 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.734797 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.734949 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.735069 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.735196 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.735344 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.735473 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.735598 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.735751 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.735909 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.736084 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.736211 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.736328 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.736460 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.736577 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.736806 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.736946 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.737069 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.737192 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.737309 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.737428 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.737559 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.737688 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.737881 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.738005 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.738155 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.738288 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.738410 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.738529 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.738647 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.738823 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.738946 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.739082 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3"
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.739204 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.739329 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.739446 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.739565 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.739732 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.739862 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.739981 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.740095 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.740210 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.740345 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.740466 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.740584 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.740734 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.740868 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.740983 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.741120 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.741242 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.741365 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.741482 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.741595 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.741808 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.741936 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.742058 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.742215 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.742339 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.742605 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.742775 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.742911 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.743035 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.743156 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" 
seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.743297 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.743423 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.743546 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.743666 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.742468 4717 manager.go:324] Recovery completed Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.743830 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744551 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744619 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744647 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744675 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744736 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744772 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744807 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744838 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744915 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744949 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.744985 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745017 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745052 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" 
seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745083 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745113 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745143 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745175 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745204 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745234 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745267 
4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745297 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745326 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745357 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745387 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745418 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745449 4717 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745478 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745513 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745546 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745578 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745612 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745642 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745671 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745798 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745827 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745855 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745876 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745898 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" 
seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745920 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745943 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745968 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.745988 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746011 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746031 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746054 4717 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746078 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746100 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746123 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746145 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746200 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746223 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746249 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746277 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746307 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746335 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746363 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746390 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746418 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746445 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746472 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746500 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746526 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746553 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746582 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746609 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746635 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746662 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746728 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746762 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" 
seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746791 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746819 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746847 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746875 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746902 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746933 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746963 4717 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.746992 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747021 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747050 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747076 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747104 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747133 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747561 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747595 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747623 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747653 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747679 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747746 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747779 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747806 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747834 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747858 4717 reconstruct.go:97] "Volume reconstruction finished" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.747873 4717 reconciler.go:26] "Reconciler: start to sync state" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.757914 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.760233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.760302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.760319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.761178 4717 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.761202 4717 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.761253 4717 state_mem.go:36] "Initialized new in-memory state store" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.777955 4717 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.780288 4717 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.780341 4717 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.780376 4717 kubelet.go:2335] "Starting kubelet main sync loop" Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.780433 4717 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.782343 4717 policy_none.go:49] "None policy: Start" Mar 08 05:26:13 crc kubenswrapper[4717]: W0308 05:26:13.784490 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.785157 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" 
logger="UnhandledError" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.785886 4717 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.785919 4717 state_mem.go:35] "Initializing new in-memory state store" Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.807338 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.833293 4717 manager.go:334] "Starting Device Plugin manager" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.833402 4717 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.833428 4717 server.go:79] "Starting device plugin registration server" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.834256 4717 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.834285 4717 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.834693 4717 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.834992 4717 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.835112 4717 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.842619 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.880558 4717 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.880702 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.882114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.882186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.882200 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.882401 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.883202 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.883472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.883511 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.883527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.883478 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.883853 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.883887 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.883743 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.884883 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.884903 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.884925 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.884935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: 
I0308 05:26:13.884912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.884979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.887339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.887516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.887670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.888272 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.888476 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.888544 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.893303 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.893356 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.893379 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.893523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.893575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.893593 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.893855 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.894031 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.894091 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.895350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.895401 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.895405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.895419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.895433 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.895453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.895611 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.895654 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.897066 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.897113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.897129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.908331 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="400ms" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.934742 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.936266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.936299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.936310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.936335 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:26:13 crc kubenswrapper[4717]: E0308 05:26:13.936943 4717 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.949617 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.949679 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.949756 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.949821 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.949908 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.950140 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.950204 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.950241 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.950423 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.950549 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.950671 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.950742 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.950777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.950830 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:13 crc kubenswrapper[4717]: I0308 05:26:13.950881 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052334 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052508 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052610 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052748 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052805 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052806 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052676 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052873 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052898 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052919 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052940 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052958 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052969 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052986 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.052994 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053020 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053044 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053052 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053075 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053082 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053125 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053119 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053154 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053141 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053189 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053124 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053240 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.053300 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.137623 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.139621 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.139730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.139751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.139796 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:26:14 crc kubenswrapper[4717]: E0308 05:26:14.140514 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Mar 08 
05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.211942 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.218923 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.234197 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.253402 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.258009 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 05:26:14 crc kubenswrapper[4717]: W0308 05:26:14.275764 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-32e1dcaa74b4d5a1e1c4a61bc43c5d8602c6dbc82f4517121f97d05340a16ca8 WatchSource:0}: Error finding container 32e1dcaa74b4d5a1e1c4a61bc43c5d8602c6dbc82f4517121f97d05340a16ca8: Status 404 returned error can't find the container with id 32e1dcaa74b4d5a1e1c4a61bc43c5d8602c6dbc82f4517121f97d05340a16ca8 Mar 08 05:26:14 crc kubenswrapper[4717]: W0308 05:26:14.281099 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-517208b9b48a0554172d18717c7bd481b9e8ea1fbca67a9f5bb68c3180acd2bf WatchSource:0}: Error finding container 517208b9b48a0554172d18717c7bd481b9e8ea1fbca67a9f5bb68c3180acd2bf: Status 404 returned error can't find the container with id 
517208b9b48a0554172d18717c7bd481b9e8ea1fbca67a9f5bb68c3180acd2bf Mar 08 05:26:14 crc kubenswrapper[4717]: W0308 05:26:14.290820 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-5745dbf0e534f6ef63283437933e38c7a3fca1dcd490add10061deb93bdb2d24 WatchSource:0}: Error finding container 5745dbf0e534f6ef63283437933e38c7a3fca1dcd490add10061deb93bdb2d24: Status 404 returned error can't find the container with id 5745dbf0e534f6ef63283437933e38c7a3fca1dcd490add10061deb93bdb2d24 Mar 08 05:26:14 crc kubenswrapper[4717]: W0308 05:26:14.293327 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f002eadbf0635c01df5d76df6f77679271d06fe642f96d3ec4287a878a8776cc WatchSource:0}: Error finding container f002eadbf0635c01df5d76df6f77679271d06fe642f96d3ec4287a878a8776cc: Status 404 returned error can't find the container with id f002eadbf0635c01df5d76df6f77679271d06fe642f96d3ec4287a878a8776cc Mar 08 05:26:14 crc kubenswrapper[4717]: W0308 05:26:14.296185 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e1dd1ba1a867537cced12a1754063137aa5ed76baadea757d79438faed6978b6 WatchSource:0}: Error finding container e1dd1ba1a867537cced12a1754063137aa5ed76baadea757d79438faed6978b6: Status 404 returned error can't find the container with id e1dd1ba1a867537cced12a1754063137aa5ed76baadea757d79438faed6978b6 Mar 08 05:26:14 crc kubenswrapper[4717]: E0308 05:26:14.309649 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="800ms" 
Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.540720 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.543039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.543173 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.543204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.543266 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:26:14 crc kubenswrapper[4717]: E0308 05:26:14.544152 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Mar 08 05:26:14 crc kubenswrapper[4717]: W0308 05:26:14.553766 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Mar 08 05:26:14 crc kubenswrapper[4717]: E0308 05:26:14.553911 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.702139 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.791295 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5745dbf0e534f6ef63283437933e38c7a3fca1dcd490add10061deb93bdb2d24"} Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.793750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"517208b9b48a0554172d18717c7bd481b9e8ea1fbca67a9f5bb68c3180acd2bf"} Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.795828 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"32e1dcaa74b4d5a1e1c4a61bc43c5d8602c6dbc82f4517121f97d05340a16ca8"} Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.797322 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f002eadbf0635c01df5d76df6f77679271d06fe642f96d3ec4287a878a8776cc"} Mar 08 05:26:14 crc kubenswrapper[4717]: I0308 05:26:14.799176 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e1dd1ba1a867537cced12a1754063137aa5ed76baadea757d79438faed6978b6"} Mar 08 05:26:14 crc kubenswrapper[4717]: W0308 05:26:14.888370 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Mar 08 05:26:14 crc kubenswrapper[4717]: E0308 05:26:14.888590 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Mar 08 05:26:14 crc kubenswrapper[4717]: W0308 05:26:14.971955 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Mar 08 05:26:14 crc kubenswrapper[4717]: E0308 05:26:14.972097 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Mar 08 05:26:15 crc kubenswrapper[4717]: E0308 05:26:15.111532 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="1.6s" Mar 08 05:26:15 crc kubenswrapper[4717]: W0308 05:26:15.341438 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Mar 08 05:26:15 crc kubenswrapper[4717]: 
E0308 05:26:15.341542 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.344972 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.347312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.347370 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.347389 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.347424 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:26:15 crc kubenswrapper[4717]: E0308 05:26:15.347921 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.696513 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 05:26:15 crc kubenswrapper[4717]: E0308 05:26:15.697579 4717 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 
38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.701900 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.805401 4717 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f" exitCode=0 Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.805501 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f"} Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.805572 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.806884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.806932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.806945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.808991 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711"} Mar 08 05:26:15 crc 
kubenswrapper[4717]: I0308 05:26:15.809029 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811"} Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.809046 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03"} Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.809061 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391"} Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.809476 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.810906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.810929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.810941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.811560 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8" exitCode=0 Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.811649 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8"} Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.811749 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.813092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.813155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.813178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.813239 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514" exitCode=0 Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.813359 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514"} Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.813492 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.814461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.814521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:15 crc 
kubenswrapper[4717]: I0308 05:26:15.814544 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.814666 4717 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="53d4fb5ac105dfbbf3b5d0ec727b9f8e51824d333bdd98edb0af5dad91dadace" exitCode=0 Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.814743 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"53d4fb5ac105dfbbf3b5d0ec727b9f8e51824d333bdd98edb0af5dad91dadace"} Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.814809 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.815738 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.815920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.815966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.816012 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.816414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.816437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:15 crc kubenswrapper[4717]: I0308 05:26:15.816471 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.701448 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Mar 08 05:26:16 crc kubenswrapper[4717]: E0308 05:26:16.712580 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="3.2s" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.818061 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.818258 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b2545fdec6c746926828dba3c09dbe50ee9ee6551b533660dc8b1970df5395db"} Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.818768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.819509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.819523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.823217 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cbdc44b696ebe4b8ff1696d157b907f65ecfa7eb765cb7ed08bb17e5aa92d6a4"} Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.823295 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.823295 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cf774318dfd4a0371268eecf9b6a694c48f986d9fc048469b4a42bcbeb22abde"} Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.823415 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"34622c54a101e225ee4c75628cc21f15006f15d3e5ffba8e722b9ccf452cec28"} Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.824481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.824511 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.824521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.826627 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772"} Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.826675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3"} Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.826708 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9"} Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.826720 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79"} Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.828029 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764" exitCode=0 Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.828130 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.828373 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764"} Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.828439 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.828896 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.828920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.828930 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.829188 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.829212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.829220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.948461 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.949611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.949649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.949658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:16 crc kubenswrapper[4717]: I0308 05:26:16.949686 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:26:16 crc kubenswrapper[4717]: E0308 05:26:16.950168 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.117526 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:17 crc 
kubenswrapper[4717]: W0308 05:26:17.117897 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Mar 08 05:26:17 crc kubenswrapper[4717]: E0308 05:26:17.117961 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Mar 08 05:26:17 crc kubenswrapper[4717]: W0308 05:26:17.185125 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Mar 08 05:26:17 crc kubenswrapper[4717]: E0308 05:26:17.185245 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.831942 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"752a0e8bd737a2214516a19f98ed484d6320233cbc73f1696f93707ea8cf918d"} Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.832090 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.833566 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.833590 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.833599 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.834068 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1" exitCode=0 Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.834162 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.834173 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.834210 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.834263 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.834485 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1"} Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.834546 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.835063 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:17 crc 
kubenswrapper[4717]: I0308 05:26:17.835091 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.835106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.835112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.835126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.835132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.835647 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.835726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.835746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.836496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.836541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:17 crc kubenswrapper[4717]: I0308 05:26:17.836562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.418541 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.841153 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034"} Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.841196 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4"} Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.841212 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232"} Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.841223 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6"} Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.841271 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.841330 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.841401 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.842535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.842573 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.842585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.843565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.843602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:18 crc kubenswrapper[4717]: I0308 05:26:18.843614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.314093 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.314292 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.315918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.315982 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.316022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.396352 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.405249 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.850300 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.850318 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c"} Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.850510 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.852241 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.852299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.852317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.853029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.853139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.853213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:19 crc kubenswrapper[4717]: I0308 05:26:19.988933 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.118587 4717 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.118723 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.150961 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.153092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.153150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.153170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.153208 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.760542 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.760806 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.760865 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.762798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.762848 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.762865 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.852886 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.852982 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.854406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.854427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.854470 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.854497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.854477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:20 crc kubenswrapper[4717]: I0308 05:26:20.854735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:21 crc kubenswrapper[4717]: I0308 05:26:21.448282 4717 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:21 crc kubenswrapper[4717]: I0308 05:26:21.448514 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:21 crc kubenswrapper[4717]: I0308 05:26:21.449943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:21 crc kubenswrapper[4717]: I0308 05:26:21.449999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:21 crc kubenswrapper[4717]: I0308 05:26:21.450021 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:22 crc kubenswrapper[4717]: I0308 05:26:22.668077 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:22 crc kubenswrapper[4717]: I0308 05:26:22.668345 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:22 crc kubenswrapper[4717]: I0308 05:26:22.669960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:22 crc kubenswrapper[4717]: I0308 05:26:22.670010 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:22 crc kubenswrapper[4717]: I0308 05:26:22.670028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:23 crc kubenswrapper[4717]: I0308 05:26:23.147595 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 08 05:26:23 crc kubenswrapper[4717]: I0308 05:26:23.147926 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:23 crc 
kubenswrapper[4717]: I0308 05:26:23.149406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:23 crc kubenswrapper[4717]: I0308 05:26:23.149465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:23 crc kubenswrapper[4717]: I0308 05:26:23.149484 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:23 crc kubenswrapper[4717]: E0308 05:26:23.842891 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.037315 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.037623 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.039296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.039352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.039375 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.767974 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.768231 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.769881 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.769936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.769955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.775549 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.864298 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.865751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.865815 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:24 crc kubenswrapper[4717]: I0308 05:26:24.865836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:27 crc kubenswrapper[4717]: W0308 05:26:27.253459 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.253554 4717 trace.go:236] Trace[307043712]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Mar-2026 05:26:17.252) (total time: 10001ms): Mar 08 05:26:27 crc kubenswrapper[4717]: Trace[307043712]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:26:27.253) Mar 08 05:26:27 crc kubenswrapper[4717]: Trace[307043712]: [10.001507782s] [10.001507782s] END Mar 08 05:26:27 crc kubenswrapper[4717]: E0308 05:26:27.253576 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.678334 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z Mar 08 05:26:27 crc kubenswrapper[4717]: E0308 05:26:27.680039 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ac66cbfc374df default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.699736799 +0000 UTC m=+0.617385693,LastTimestamp:2026-03-08 05:26:13.699736799 +0000 UTC m=+0.617385693,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:27 crc 
kubenswrapper[4717]: E0308 05:26:27.680643 4717 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 05:26:27 crc kubenswrapper[4717]: W0308 05:26:27.680950 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z Mar 08 05:26:27 crc kubenswrapper[4717]: E0308 05:26:27.681026 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 05:26:27 crc kubenswrapper[4717]: E0308 05:26:27.681525 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z" node="crc" Mar 08 05:26:27 crc kubenswrapper[4717]: W0308 05:26:27.683040 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z Mar 08 05:26:27 crc kubenswrapper[4717]: E0308 05:26:27.683110 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 05:26:27 crc kubenswrapper[4717]: E0308 05:26:27.685007 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.686696 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.686770 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 08 05:26:27 crc kubenswrapper[4717]: 
W0308 05:26:27.687953 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z Mar 08 05:26:27 crc kubenswrapper[4717]: E0308 05:26:27.688042 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.693879 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.693944 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.705890 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-08T05:26:27Z is after 2026-02-23T05:33:13Z Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.874036 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.875749 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="752a0e8bd737a2214516a19f98ed484d6320233cbc73f1696f93707ea8cf918d" exitCode=255 Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.875783 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"752a0e8bd737a2214516a19f98ed484d6320233cbc73f1696f93707ea8cf918d"} Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.875913 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.876710 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.876750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.876759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:27 crc kubenswrapper[4717]: I0308 05:26:27.877495 4717 scope.go:117] "RemoveContainer" containerID="752a0e8bd737a2214516a19f98ed484d6320233cbc73f1696f93707ea8cf918d" Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.682745 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.721271 4717 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:28Z is after 2026-02-23T05:33:13Z Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.880499 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.880934 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.882573 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a045c729d0463db0a6fc979cec11ba5b0bbc5032ad935bee3a7dc299df2dc5ea" exitCode=255 Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.882607 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a045c729d0463db0a6fc979cec11ba5b0bbc5032ad935bee3a7dc299df2dc5ea"} Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.882647 4717 scope.go:117] "RemoveContainer" containerID="752a0e8bd737a2214516a19f98ed484d6320233cbc73f1696f93707ea8cf918d" Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.882771 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.883450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.883476 4717 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.883485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:28 crc kubenswrapper[4717]: I0308 05:26:28.883977 4717 scope.go:117] "RemoveContainer" containerID="a045c729d0463db0a6fc979cec11ba5b0bbc5032ad935bee3a7dc299df2dc5ea" Mar 08 05:26:28 crc kubenswrapper[4717]: E0308 05:26:28.884218 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:26:29 crc kubenswrapper[4717]: I0308 05:26:29.705535 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:29Z is after 2026-02-23T05:33:13Z Mar 08 05:26:29 crc kubenswrapper[4717]: I0308 05:26:29.889635 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 05:26:29 crc kubenswrapper[4717]: I0308 05:26:29.893983 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:29 crc kubenswrapper[4717]: I0308 05:26:29.895714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:29 crc kubenswrapper[4717]: I0308 05:26:29.895780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 05:26:29 crc kubenswrapper[4717]: I0308 05:26:29.895800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:29 crc kubenswrapper[4717]: I0308 05:26:29.896881 4717 scope.go:117] "RemoveContainer" containerID="a045c729d0463db0a6fc979cec11ba5b0bbc5032ad935bee3a7dc299df2dc5ea" Mar 08 05:26:29 crc kubenswrapper[4717]: E0308 05:26:29.897182 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:26:30 crc kubenswrapper[4717]: I0308 05:26:30.119300 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:26:30 crc kubenswrapper[4717]: I0308 05:26:30.119492 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:26:30 crc kubenswrapper[4717]: I0308 05:26:30.704757 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:30Z is after 2026-02-23T05:33:13Z Mar 08 05:26:31 crc kubenswrapper[4717]: I0308 05:26:31.448614 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:31 crc kubenswrapper[4717]: I0308 05:26:31.448957 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:31 crc kubenswrapper[4717]: I0308 05:26:31.450706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:31 crc kubenswrapper[4717]: I0308 05:26:31.450757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:31 crc kubenswrapper[4717]: I0308 05:26:31.450770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:31 crc kubenswrapper[4717]: I0308 05:26:31.451396 4717 scope.go:117] "RemoveContainer" containerID="a045c729d0463db0a6fc979cec11ba5b0bbc5032ad935bee3a7dc299df2dc5ea" Mar 08 05:26:31 crc kubenswrapper[4717]: E0308 05:26:31.451605 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:26:31 crc kubenswrapper[4717]: I0308 05:26:31.708101 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:31Z is after 
2026-02-23T05:33:13Z Mar 08 05:26:32 crc kubenswrapper[4717]: W0308 05:26:32.212145 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:32Z is after 2026-02-23T05:33:13Z Mar 08 05:26:32 crc kubenswrapper[4717]: E0308 05:26:32.212258 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 05:26:32 crc kubenswrapper[4717]: W0308 05:26:32.443264 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:32Z is after 2026-02-23T05:33:13Z Mar 08 05:26:32 crc kubenswrapper[4717]: E0308 05:26:32.443413 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.676911 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.677052 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.678215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.678249 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.678258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.678933 4717 scope.go:117] "RemoveContainer" containerID="a045c729d0463db0a6fc979cec11ba5b0bbc5032ad935bee3a7dc299df2dc5ea" Mar 08 05:26:32 crc kubenswrapper[4717]: E0308 05:26:32.679092 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.683837 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.708198 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:26:32Z is after 2026-02-23T05:33:13Z Mar 08 05:26:32 crc 
kubenswrapper[4717]: I0308 05:26:32.902116 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.903333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.903373 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.903382 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:32 crc kubenswrapper[4717]: I0308 05:26:32.903966 4717 scope.go:117] "RemoveContainer" containerID="a045c729d0463db0a6fc979cec11ba5b0bbc5032ad935bee3a7dc299df2dc5ea" Mar 08 05:26:32 crc kubenswrapper[4717]: E0308 05:26:32.904143 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:26:33 crc kubenswrapper[4717]: I0308 05:26:33.708216 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:33 crc kubenswrapper[4717]: E0308 05:26:33.843125 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.078150 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 08 05:26:34 crc 
kubenswrapper[4717]: I0308 05:26:34.078463 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.082213 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.095107 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 08 05:26:34 crc kubenswrapper[4717]: E0308 05:26:34.095245 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.119065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.119105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.119117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.119080 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.119195 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.119210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.119246 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:26:34 crc 
kubenswrapper[4717]: E0308 05:26:34.123911 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.713655 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.908094 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.909739 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.909813 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:34 crc kubenswrapper[4717]: I0308 05:26:34.909832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:34 crc kubenswrapper[4717]: W0308 05:26:34.943374 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 08 05:26:34 crc kubenswrapper[4717]: E0308 05:26:34.943456 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 05:26:35 crc kubenswrapper[4717]: I0308 
05:26:35.708792 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:36 crc kubenswrapper[4717]: I0308 05:26:36.338096 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 05:26:36 crc kubenswrapper[4717]: I0308 05:26:36.361346 4717 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 05:26:36 crc kubenswrapper[4717]: I0308 05:26:36.708543 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.685806 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cbfc374df default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.699736799 +0000 UTC m=+0.617385693,LastTimestamp:2026-03-08 05:26:13.699736799 +0000 UTC m=+0.617385693,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.689757 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35f6548 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760288072 +0000 UTC m=+0.677936926,LastTimestamp:2026-03-08 05:26:13.760288072 +0000 UTC m=+0.677936926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.693881 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35fc79b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760313243 +0000 UTC m=+0.677962097,LastTimestamp:2026-03-08 05:26:13.760313243 +0000 UTC m=+0.677962097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.698637 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc3600751 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760329553 +0000 UTC m=+0.677978417,LastTimestamp:2026-03-08 05:26:13.760329553 +0000 UTC m=+0.677978417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: I0308 05:26:37.702892 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.703248 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc7f28f81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.837041537 +0000 UTC m=+0.754690391,LastTimestamp:2026-03-08 05:26:13.837041537 +0000 UTC m=+0.754690391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.709882 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35f6548\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35f6548 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760288072 +0000 UTC m=+0.677936926,LastTimestamp:2026-03-08 05:26:13.882165398 +0000 UTC m=+0.799814252,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.713814 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35fc79b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35fc79b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760313243 +0000 UTC m=+0.677962097,LastTimestamp:2026-03-08 05:26:13.882195568 +0000 UTC m=+0.799844422,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.717486 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc3600751\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc3600751 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760329553 +0000 UTC m=+0.677978417,LastTimestamp:2026-03-08 05:26:13.882208869 +0000 UTC m=+0.799857733,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.723635 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35f6548\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35f6548 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760288072 +0000 UTC m=+0.677936926,LastTimestamp:2026-03-08 05:26:13.883501321 +0000 UTC m=+0.801150175,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.727832 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35fc79b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35fc79b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760313243 +0000 UTC m=+0.677962097,LastTimestamp:2026-03-08 05:26:13.883520271 +0000 UTC m=+0.801169125,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.733964 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc3600751\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc3600751 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760329553 +0000 UTC m=+0.677978417,LastTimestamp:2026-03-08 05:26:13.883535412 +0000 UTC m=+0.801184266,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.740101 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35f6548\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35f6548 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760288072 +0000 UTC 
m=+0.677936926,LastTimestamp:2026-03-08 05:26:13.884903986 +0000 UTC m=+0.802552840,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.744717 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35f6548\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35f6548 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760288072 +0000 UTC m=+0.677936926,LastTimestamp:2026-03-08 05:26:13.884918666 +0000 UTC m=+0.802567510,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.750054 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35fc79b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35fc79b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760313243 +0000 UTC m=+0.677962097,LastTimestamp:2026-03-08 05:26:13.884931646 +0000 UTC m=+0.802580490,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.753913 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc3600751\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc3600751 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760329553 +0000 UTC m=+0.677978417,LastTimestamp:2026-03-08 05:26:13.884941426 +0000 UTC m=+0.802590270,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.757806 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35fc79b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35fc79b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760313243 +0000 UTC m=+0.677962097,LastTimestamp:2026-03-08 05:26:13.884962417 +0000 UTC m=+0.802611261,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.761807 4717 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189ac66cc3600751\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc3600751 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760329553 +0000 UTC m=+0.677978417,LastTimestamp:2026-03-08 05:26:13.884988278 +0000 UTC m=+0.802637122,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.765325 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35f6548\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35f6548 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760288072 +0000 UTC m=+0.677936926,LastTimestamp:2026-03-08 05:26:13.88749565 +0000 UTC m=+0.805144534,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.769338 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35fc79b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35fc79b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760313243 +0000 UTC m=+0.677962097,LastTimestamp:2026-03-08 05:26:13.887657744 +0000 UTC m=+0.805306648,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.774222 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc3600751\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc3600751 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760329553 +0000 UTC m=+0.677978417,LastTimestamp:2026-03-08 05:26:13.887987422 +0000 UTC m=+0.805636306,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.778364 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35f6548\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35f6548 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760288072 +0000 UTC m=+0.677936926,LastTimestamp:2026-03-08 05:26:13.893340885 +0000 UTC m=+0.810989759,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.782653 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35fc79b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35fc79b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760313243 +0000 UTC m=+0.677962097,LastTimestamp:2026-03-08 05:26:13.893370516 +0000 UTC m=+0.811019400,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.786984 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc3600751\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc3600751 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760329553 +0000 UTC m=+0.677978417,LastTimestamp:2026-03-08 05:26:13.893389456 +0000 UTC m=+0.811038330,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.792720 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35f6548\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35f6548 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760288072 +0000 UTC m=+0.677936926,LastTimestamp:2026-03-08 05:26:13.89356071 +0000 UTC m=+0.811209574,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.798659 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ac66cc35fc79b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ac66cc35fc79b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:13.760313243 +0000 UTC 
m=+0.677962097,LastTimestamp:2026-03-08 05:26:13.893586881 +0000 UTC m=+0.811235745,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.806064 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ac66ce2b46b13 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.285953811 +0000 UTC m=+1.203602695,LastTimestamp:2026-03-08 05:26:14.285953811 +0000 UTC m=+1.203602695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.810642 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66ce2d9df52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.288408402 +0000 UTC m=+1.206057286,LastTimestamp:2026-03-08 05:26:14.288408402 +0000 UTC m=+1.206057286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.816662 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66ce3371ade openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.294518494 +0000 UTC m=+1.212167378,LastTimestamp:2026-03-08 05:26:14.294518494 +0000 UTC m=+1.212167378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.820151 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66ce362d139 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.297383225 +0000 UTC m=+1.215032119,LastTimestamp:2026-03-08 05:26:14.297383225 +0000 UTC m=+1.215032119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.825422 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66ce3ad3e28 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.302260776 +0000 UTC 
m=+1.219909660,LastTimestamp:2026-03-08 05:26:14.302260776 +0000 UTC m=+1.219909660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.830831 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d08c0ae2b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.924291627 +0000 UTC m=+1.841940521,LastTimestamp:2026-03-08 05:26:14.924291627 +0000 UTC m=+1.841940521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.836099 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d08c10d5e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container 
wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.924315998 +0000 UTC m=+1.841964882,LastTimestamp:2026-03-08 05:26:14.924315998 +0000 UTC m=+1.841964882,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.839990 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66d08c33127 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.924456231 +0000 UTC m=+1.842105115,LastTimestamp:2026-03-08 05:26:14.924456231 +0000 UTC m=+1.842105115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.846869 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ac66d08c54a3d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created 
container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.924593725 +0000 UTC m=+1.842242609,LastTimestamp:2026-03-08 05:26:14.924593725 +0000 UTC m=+1.842242609,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.851336 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d08c641be openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.924657086 +0000 UTC m=+1.842305970,LastTimestamp:2026-03-08 05:26:14.924657086 +0000 UTC m=+1.842305970,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.859343 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ac66d09722d26 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.935924006 +0000 UTC m=+1.853572850,LastTimestamp:2026-03-08 05:26:14.935924006 +0000 UTC m=+1.853572850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.864305 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d099300fd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.938075389 +0000 UTC m=+1.855724273,LastTimestamp:2026-03-08 05:26:14.938075389 +0000 UTC m=+1.855724273,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.870181 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d09b38e34 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.940208692 +0000 UTC m=+1.857857536,LastTimestamp:2026-03-08 05:26:14.940208692 +0000 UTC m=+1.857857536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.874026 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d09e3a860 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.94336112 +0000 UTC m=+1.861009984,LastTimestamp:2026-03-08 05:26:14.94336112 +0000 UTC m=+1.861009984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc 
kubenswrapper[4717]: E0308 05:26:37.880217 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d09ebd62e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.943897134 +0000 UTC m=+1.861545988,LastTimestamp:2026-03-08 05:26:14.943897134 +0000 UTC m=+1.861545988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.884119 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66d09fe545c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.945109084 +0000 UTC m=+1.862757938,LastTimestamp:2026-03-08 05:26:14.945109084 +0000 UTC m=+1.862757938,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.888472 4717 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d1b3efbf0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.23455896 +0000 UTC m=+2.152207854,LastTimestamp:2026-03-08 05:26:15.23455896 +0000 UTC m=+2.152207854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.892206 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d1c372ff4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.250825204 +0000 UTC m=+2.168474088,LastTimestamp:2026-03-08 05:26:15.250825204 +0000 UTC m=+2.168474088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.898302 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d1c4ebd2a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.252368682 +0000 UTC m=+2.170017566,LastTimestamp:2026-03-08 05:26:15.252368682 +0000 UTC m=+2.170017566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.903851 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d279b4c94 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.441935508 +0000 UTC m=+2.359584352,LastTimestamp:2026-03-08 05:26:15.441935508 +0000 UTC m=+2.359584352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.907194 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d28262950 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.451035984 +0000 UTC m=+2.368684868,LastTimestamp:2026-03-08 05:26:15.451035984 +0000 UTC m=+2.368684868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.911860 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d2836e6b1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.452133041 +0000 UTC m=+2.369781875,LastTimestamp:2026-03-08 05:26:15.452133041 +0000 UTC m=+2.369781875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.917669 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d345b2b82 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.655836546 +0000 UTC m=+2.573485420,LastTimestamp:2026-03-08 05:26:15.655836546 +0000 UTC 
m=+2.573485420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.924214 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d3531e703 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.669909251 +0000 UTC m=+2.587558125,LastTimestamp:2026-03-08 05:26:15.669909251 +0000 UTC m=+2.587558125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.928565 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d3d77fc21 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.808719905 +0000 UTC m=+2.726368759,LastTimestamp:2026-03-08 05:26:15.808719905 +0000 UTC m=+2.726368759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.936134 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d3de04922 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.815555362 +0000 UTC m=+2.733204206,LastTimestamp:2026-03-08 05:26:15.815555362 +0000 UTC m=+2.733204206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.942771 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66d3def0344 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.816520516 +0000 UTC m=+2.734169390,LastTimestamp:2026-03-08 05:26:15.816520516 +0000 UTC m=+2.734169390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.949201 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ac66d3e227e89 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.819894409 +0000 UTC m=+2.737543293,LastTimestamp:2026-03-08 05:26:15.819894409 +0000 UTC m=+2.737543293,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.955981 4717 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d4ce0c778 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.067245944 +0000 UTC m=+2.984894788,LastTimestamp:2026-03-08 05:26:16.067245944 +0000 UTC m=+2.984894788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.960314 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d4ce0e439 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.067253305 +0000 UTC m=+2.984902149,LastTimestamp:2026-03-08 05:26:16.067253305 +0000 UTC m=+2.984902149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc 
kubenswrapper[4717]: E0308 05:26:37.964474 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ac66d4ce8490b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.067737867 +0000 UTC m=+2.985386711,LastTimestamp:2026-03-08 05:26:16.067737867 +0000 UTC m=+2.985386711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.968630 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66d4ce9739a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.067814298 +0000 UTC m=+2.985463152,LastTimestamp:2026-03-08 05:26:16.067814298 +0000 UTC m=+2.985463152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.973774 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d4dccf1a2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.082723234 +0000 UTC m=+3.000372078,LastTimestamp:2026-03-08 05:26:16.082723234 +0000 UTC m=+3.000372078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.978889 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d4ddc61cd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.083734989 +0000 UTC m=+3.001383843,LastTimestamp:2026-03-08 05:26:16.083734989 +0000 UTC m=+3.001383843,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.983572 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d4de0e61e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.084031006 +0000 UTC m=+3.001679850,LastTimestamp:2026-03-08 05:26:16.084031006 +0000 UTC m=+3.001679850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.987065 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d4de95910 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.08458472 +0000 UTC m=+3.002233564,LastTimestamp:2026-03-08 05:26:16.08458472 +0000 UTC m=+3.002233564,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.992869 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ac66d4e068a2d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.086497837 +0000 UTC m=+3.004146691,LastTimestamp:2026-03-08 05:26:16.086497837 +0000 UTC m=+3.004146691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.995734 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66d4e16cc9f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.087563423 +0000 UTC m=+3.005212277,LastTimestamp:2026-03-08 05:26:16.087563423 +0000 UTC m=+3.005212277,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:37 crc kubenswrapper[4717]: E0308 05:26:37.998426 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d5ab32a9b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.299137691 +0000 UTC m=+3.216786535,LastTimestamp:2026-03-08 05:26:16.299137691 +0000 UTC m=+3.216786535,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.000028 4717 event.go:359] "Server rejected event (will not retry!)" err="events 
is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d5acb5264 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.30072074 +0000 UTC m=+3.218369584,LastTimestamp:2026-03-08 05:26:16.30072074 +0000 UTC m=+3.218369584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.003396 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d5b3fbbdd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.308349917 +0000 UTC m=+3.225998761,LastTimestamp:2026-03-08 05:26:16.308349917 +0000 UTC m=+3.225998761,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: 
E0308 05:26:38.008159 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d5b516f07 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.309509895 +0000 UTC m=+3.227158739,LastTimestamp:2026-03-08 05:26:16.309509895 +0000 UTC m=+3.227158739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.013060 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d5bde89fc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.318757372 +0000 UTC m=+3.236406216,LastTimestamp:2026-03-08 
05:26:16.318757372 +0000 UTC m=+3.236406216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.017555 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d5bf5bc69 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.320277609 +0000 UTC m=+3.237926453,LastTimestamp:2026-03-08 05:26:16.320277609 +0000 UTC m=+3.237926453,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.023136 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d651447db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.473274331 +0000 UTC m=+3.390923175,LastTimestamp:2026-03-08 05:26:16.473274331 +0000 UTC m=+3.390923175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.029543 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d6535777e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.475449214 +0000 UTC m=+3.393098078,LastTimestamp:2026-03-08 05:26:16.475449214 +0000 UTC m=+3.393098078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.034430 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ac66d65d364d8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.485799128 +0000 UTC m=+3.403447972,LastTimestamp:2026-03-08 05:26:16.485799128 +0000 UTC m=+3.403447972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.039250 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d66d19705 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.502458117 +0000 UTC m=+3.420106981,LastTimestamp:2026-03-08 05:26:16.502458117 +0000 UTC m=+3.420106981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.044090 4717 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d66e580f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.503763189 +0000 UTC m=+3.421412033,LastTimestamp:2026-03-08 05:26:16.503763189 +0000 UTC m=+3.421412033,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.048576 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d7238e64e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.693777998 +0000 UTC m=+3.611426842,LastTimestamp:2026-03-08 05:26:16.693777998 +0000 UTC 
m=+3.611426842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.054826 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d72e81648 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.70525908 +0000 UTC m=+3.622907934,LastTimestamp:2026-03-08 05:26:16.70525908 +0000 UTC m=+3.622907934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.059997 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d72f8393e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.706316606 +0000 UTC m=+3.623965460,LastTimestamp:2026-03-08 05:26:16.706316606 +0000 UTC m=+3.623965460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.065639 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66d7a5d3244 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.830374468 +0000 UTC m=+3.748023312,LastTimestamp:2026-03-08 05:26:16.830374468 +0000 UTC m=+3.748023312,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.069603 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d7d56dfc6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.880291782 +0000 UTC m=+3.797940626,LastTimestamp:2026-03-08 05:26:16.880291782 +0000 UTC m=+3.797940626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.073769 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d7ddaab00 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.888929024 +0000 UTC m=+3.806577868,LastTimestamp:2026-03-08 05:26:16.888929024 +0000 UTC m=+3.806577868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.081234 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ac66d853ea5a4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:17.012921764 +0000 UTC m=+3.930570608,LastTimestamp:2026-03-08 05:26:17.012921764 +0000 UTC m=+3.930570608,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.086425 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66d85c831cc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:17.021936076 +0000 UTC m=+3.939584920,LastTimestamp:2026-03-08 05:26:17.021936076 +0000 UTC m=+3.939584920,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.091172 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ac66db6742c43 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:17.838513219 +0000 UTC m=+4.756162063,LastTimestamp:2026-03-08 05:26:17.838513219 +0000 UTC m=+4.756162063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.098768 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66dc2015906 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.03231463 +0000 UTC m=+4.949963504,LastTimestamp:2026-03-08 05:26:18.03231463 +0000 UTC m=+4.949963504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.105145 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66dc2c43d08 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.045086984 +0000 UTC m=+4.962735868,LastTimestamp:2026-03-08 05:26:18.045086984 +0000 UTC m=+4.962735868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.110227 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66dc2d6c283 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.046300803 +0000 UTC m=+4.963949687,LastTimestamp:2026-03-08 05:26:18.046300803 +0000 UTC m=+4.963949687,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.114995 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66dd258543d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.306450493 +0000 UTC m=+5.224099377,LastTimestamp:2026-03-08 05:26:18.306450493 +0000 UTC m=+5.224099377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.120110 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66dd31b1250 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.319213136 +0000 UTC m=+5.236862010,LastTimestamp:2026-03-08 05:26:18.319213136 +0000 UTC m=+5.236862010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.125015 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ac66dd334535a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.320868186 +0000 UTC m=+5.238517070,LastTimestamp:2026-03-08 05:26:18.320868186 +0000 UTC m=+5.238517070,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.130100 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66de1ee35ef openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.567931375 +0000 UTC m=+5.485580259,LastTimestamp:2026-03-08 05:26:18.567931375 +0000 UTC m=+5.485580259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.135057 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66de2aa7261 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.580267617 +0000 UTC m=+5.497916491,LastTimestamp:2026-03-08 05:26:18.580267617 +0000 UTC m=+5.497916491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.139544 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66de2bf5ad1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.581637841 +0000 UTC m=+5.499286725,LastTimestamp:2026-03-08 05:26:18.581637841 +0000 UTC m=+5.499286725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.145178 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66df1204dd9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.822872537 +0000 UTC m=+5.740521391,LastTimestamp:2026-03-08 05:26:18.822872537 +0000 UTC m=+5.740521391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.150317 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66df1c5d453 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.833720403 +0000 UTC m=+5.751369267,LastTimestamp:2026-03-08 05:26:18.833720403 +0000 UTC m=+5.751369267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.156326 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66df1d729f8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:18.83485644 +0000 UTC m=+5.752505294,LastTimestamp:2026-03-08 05:26:18.83485644 +0000 UTC m=+5.752505294,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.161234 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66dff0d2092 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:19.056496786 +0000 UTC m=+5.974145670,LastTimestamp:2026-03-08 05:26:19.056496786 +0000 UTC m=+5.974145670,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.165288 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ac66dffbd4a01 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:19.068041729 +0000 UTC m=+5.985690583,LastTimestamp:2026-03-08 05:26:19.068041729 +0000 UTC m=+5.985690583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.171190 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 05:26:38 crc kubenswrapper[4717]: &Event{ObjectMeta:{kube-controller-manager-crc.189ac66e3e5c6b4f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 08 05:26:38 crc kubenswrapper[4717]: body: Mar 08 05:26:38 crc kubenswrapper[4717]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:20.118657871 +0000 UTC m=+7.036306755,LastTimestamp:2026-03-08 05:26:20.118657871 +0000 UTC m=+7.036306755,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 05:26:38 crc kubenswrapper[4717]: > Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.175637 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66e3e5e1006 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:20.118765574 +0000 UTC m=+7.036414458,LastTimestamp:2026-03-08 05:26:20.118765574 +0000 UTC m=+7.036414458,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.181618 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 05:26:38 crc kubenswrapper[4717]: &Event{ObjectMeta:{kube-apiserver-crc.189ac67001744e60 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 08 05:26:38 crc kubenswrapper[4717]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 05:26:38 crc kubenswrapper[4717]: Mar 08 05:26:38 crc kubenswrapper[4717]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:27.686747744 +0000 UTC m=+14.604396588,LastTimestamp:2026-03-08 05:26:27.686747744 +0000 UTC m=+14.604396588,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 05:26:38 crc kubenswrapper[4717]: > Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.185893 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac67001750f13 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:27.686797075 +0000 UTC m=+14.604445919,LastTimestamp:2026-03-08 05:26:27.686797075 +0000 UTC m=+14.604445919,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.189579 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ac67001744e60\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 05:26:38 crc kubenswrapper[4717]: &Event{ObjectMeta:{kube-apiserver-crc.189ac67001744e60 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 08 05:26:38 crc kubenswrapper[4717]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 05:26:38 crc kubenswrapper[4717]: Mar 08 05:26:38 crc kubenswrapper[4717]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:27.686747744 +0000 UTC m=+14.604396588,LastTimestamp:2026-03-08 05:26:27.69392622 +0000 UTC m=+14.611575074,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 05:26:38 crc kubenswrapper[4717]: > Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.193500 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ac67001750f13\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac67001750f13 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:27.686797075 +0000 UTC m=+14.604445919,LastTimestamp:2026-03-08 05:26:27.693967601 +0000 UTC m=+14.611616445,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.199485 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ac66d72f8393e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d72f8393e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.706316606 +0000 UTC m=+3.623965460,LastTimestamp:2026-03-08 05:26:27.878345202 +0000 UTC m=+14.795994046,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.204227 4717 event.go:359] "Server rejected event (will not 
retry!)" err="events \"kube-apiserver-crc.189ac66d7d56dfc6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d7d56dfc6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.880291782 +0000 UTC m=+3.797940626,LastTimestamp:2026-03-08 05:26:28.033994669 +0000 UTC m=+14.951643513,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.208614 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ac66d7ddaab00\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ac66d7ddaab00 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:16.888929024 +0000 UTC m=+3.806577868,LastTimestamp:2026-03-08 05:26:28.043751838 +0000 UTC m=+14.961400682,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.213938 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 05:26:38 crc kubenswrapper[4717]: &Event{ObjectMeta:{kube-controller-manager-crc.189ac67092742df8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 05:26:38 crc kubenswrapper[4717]: body: Mar 08 05:26:38 crc kubenswrapper[4717]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:30.119435768 +0000 UTC m=+17.037084652,LastTimestamp:2026-03-08 05:26:30.119435768 +0000 UTC m=+17.037084652,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 05:26:38 crc kubenswrapper[4717]: > Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.218178 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac6709275ce9e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:30.11954243 +0000 UTC m=+17.037191314,LastTimestamp:2026-03-08 05:26:30.11954243 +0000 UTC m=+17.037191314,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:38 crc kubenswrapper[4717]: I0308 05:26:38.682395 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:38 crc kubenswrapper[4717]: I0308 05:26:38.682649 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:38 crc kubenswrapper[4717]: I0308 05:26:38.684278 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:38 crc kubenswrapper[4717]: I0308 05:26:38.684339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:38 crc kubenswrapper[4717]: I0308 05:26:38.684358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:38 crc kubenswrapper[4717]: I0308 05:26:38.685325 4717 scope.go:117] "RemoveContainer" containerID="a045c729d0463db0a6fc979cec11ba5b0bbc5032ad935bee3a7dc299df2dc5ea" Mar 08 05:26:38 crc kubenswrapper[4717]: E0308 05:26:38.685611 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:26:38 crc kubenswrapper[4717]: I0308 05:26:38.708355 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:39 crc kubenswrapper[4717]: W0308 05:26:39.709076 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 08 05:26:39 crc kubenswrapper[4717]: E0308 05:26:39.709153 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 05:26:39 crc kubenswrapper[4717]: I0308 05:26:39.709199 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:39 crc kubenswrapper[4717]: W0308 05:26:39.872781 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:39 crc kubenswrapper[4717]: E0308 05:26:39.872874 4717 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.119187 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.119349 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.119429 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.119665 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.121566 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.121631 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.121650 4717 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.122616 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.122939 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03" gracePeriod=30 Mar 08 05:26:40 crc kubenswrapper[4717]: E0308 05:26:40.130335 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ac67092742df8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 05:26:40 crc kubenswrapper[4717]: &Event{ObjectMeta:{kube-controller-manager-crc.189ac67092742df8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 05:26:40 crc kubenswrapper[4717]: body: Mar 08 05:26:40 crc kubenswrapper[4717]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:30.119435768 +0000 UTC m=+17.037084652,LastTimestamp:2026-03-08 05:26:40.119304051 +0000 UTC m=+27.036952935,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 05:26:40 crc kubenswrapper[4717]: > Mar 08 05:26:40 crc kubenswrapper[4717]: E0308 05:26:40.137758 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ac6709275ce9e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac6709275ce9e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:30.11954243 +0000 UTC m=+17.037191314,LastTimestamp:2026-03-08 05:26:40.119390364 +0000 UTC m=+27.037039238,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:40 crc kubenswrapper[4717]: E0308 05:26:40.141618 4717 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac672e6b52066 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:40.12291287 +0000 UTC m=+27.040561744,LastTimestamp:2026-03-08 05:26:40.12291287 +0000 UTC m=+27.040561744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:40 crc kubenswrapper[4717]: W0308 05:26:40.256217 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 08 05:26:40 crc kubenswrapper[4717]: E0308 05:26:40.256301 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 05:26:40 crc kubenswrapper[4717]: E0308 05:26:40.256292 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ac66d09b38e34\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d09b38e34 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:14.940208692 +0000 UTC m=+1.857857536,LastTimestamp:2026-03-08 05:26:40.248471049 +0000 UTC m=+27.166119923,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:40 crc kubenswrapper[4717]: E0308 05:26:40.510925 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ac66d1b3efbf0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d1b3efbf0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.23455896 +0000 UTC m=+2.152207854,LastTimestamp:2026-03-08 05:26:40.50175813 +0000 UTC m=+27.419407004,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:40 crc kubenswrapper[4717]: E0308 05:26:40.523359 4717 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189ac66d1c372ff4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac66d1c372ff4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:15.250825204 +0000 UTC m=+2.168474088,LastTimestamp:2026-03-08 05:26:40.515374694 +0000 UTC m=+27.433023568,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.709146 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.930732 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.931330 4717 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03" exitCode=255 Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.931379 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03"} Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.931452 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6"} Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.931645 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.933059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.933107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:40 crc kubenswrapper[4717]: I0308 05:26:40.933126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:41 crc kubenswrapper[4717]: E0308 05:26:41.101104 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 05:26:41 crc kubenswrapper[4717]: I0308 05:26:41.124792 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:41 crc kubenswrapper[4717]: I0308 05:26:41.126535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:41 crc kubenswrapper[4717]: I0308 05:26:41.126593 4717 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:41 crc kubenswrapper[4717]: I0308 05:26:41.126615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:41 crc kubenswrapper[4717]: I0308 05:26:41.126657 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:26:41 crc kubenswrapper[4717]: E0308 05:26:41.131935 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 05:26:41 crc kubenswrapper[4717]: I0308 05:26:41.708019 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:42 crc kubenswrapper[4717]: I0308 05:26:42.709465 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:43 crc kubenswrapper[4717]: I0308 05:26:43.706099 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:43 crc kubenswrapper[4717]: E0308 05:26:43.843323 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 05:26:44 crc kubenswrapper[4717]: I0308 05:26:44.707726 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:44 crc kubenswrapper[4717]: I0308 05:26:44.766985 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:44 crc kubenswrapper[4717]: I0308 05:26:44.767152 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:44 crc kubenswrapper[4717]: I0308 05:26:44.768515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:44 crc kubenswrapper[4717]: I0308 05:26:44.768577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:44 crc kubenswrapper[4717]: I0308 05:26:44.768595 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:45 crc kubenswrapper[4717]: I0308 05:26:45.708210 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:46 crc kubenswrapper[4717]: I0308 05:26:46.706239 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:47 crc kubenswrapper[4717]: I0308 05:26:47.117623 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:26:47 crc kubenswrapper[4717]: I0308 05:26:47.117805 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:47 crc kubenswrapper[4717]: I0308 05:26:47.119459 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:47 crc kubenswrapper[4717]: I0308 05:26:47.119488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:47 crc kubenswrapper[4717]: I0308 05:26:47.119497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:47 crc kubenswrapper[4717]: I0308 05:26:47.705499 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:48 crc kubenswrapper[4717]: E0308 05:26:48.106977 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 05:26:48 crc kubenswrapper[4717]: I0308 05:26:48.133000 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:48 crc kubenswrapper[4717]: I0308 05:26:48.134875 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:48 crc kubenswrapper[4717]: I0308 05:26:48.134935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:48 crc kubenswrapper[4717]: I0308 05:26:48.134945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:48 crc kubenswrapper[4717]: I0308 05:26:48.134969 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:26:48 crc kubenswrapper[4717]: E0308 05:26:48.141637 4717 kubelet_node_status.go:99] 
"Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 05:26:48 crc kubenswrapper[4717]: I0308 05:26:48.708421 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:49 crc kubenswrapper[4717]: I0308 05:26:49.708569 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:50 crc kubenswrapper[4717]: I0308 05:26:50.117992 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:26:50 crc kubenswrapper[4717]: I0308 05:26:50.118050 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:26:50 crc kubenswrapper[4717]: E0308 05:26:50.124900 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ac67092742df8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 
05:26:50 crc kubenswrapper[4717]: &Event{ObjectMeta:{kube-controller-manager-crc.189ac67092742df8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 05:26:50 crc kubenswrapper[4717]: body: Mar 08 05:26:50 crc kubenswrapper[4717]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:30.119435768 +0000 UTC m=+17.037084652,LastTimestamp:2026-03-08 05:26:50.118036334 +0000 UTC m=+37.035685178,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 05:26:50 crc kubenswrapper[4717]: > Mar 08 05:26:50 crc kubenswrapper[4717]: E0308 05:26:50.131324 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ac6709275ce9e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ac6709275ce9e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:30.11954243 +0000 UTC m=+17.037191314,LastTimestamp:2026-03-08 05:26:50.118072695 +0000 UTC m=+37.035721539,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:26:50 crc kubenswrapper[4717]: I0308 05:26:50.708709 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:51 crc kubenswrapper[4717]: W0308 05:26:51.592311 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 08 05:26:51 crc kubenswrapper[4717]: E0308 05:26:51.592382 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 05:26:51 crc kubenswrapper[4717]: I0308 05:26:51.708304 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:52 crc kubenswrapper[4717]: I0308 05:26:52.708884 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:53 crc 
kubenswrapper[4717]: I0308 05:26:53.708890 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:53 crc kubenswrapper[4717]: I0308 05:26:53.780840 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:53 crc kubenswrapper[4717]: I0308 05:26:53.782557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:53 crc kubenswrapper[4717]: I0308 05:26:53.782611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:53 crc kubenswrapper[4717]: I0308 05:26:53.782628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:53 crc kubenswrapper[4717]: I0308 05:26:53.783489 4717 scope.go:117] "RemoveContainer" containerID="a045c729d0463db0a6fc979cec11ba5b0bbc5032ad935bee3a7dc299df2dc5ea" Mar 08 05:26:53 crc kubenswrapper[4717]: E0308 05:26:53.843596 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 05:26:54 crc kubenswrapper[4717]: I0308 05:26:54.707932 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:54 crc kubenswrapper[4717]: I0308 05:26:54.976900 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 05:26:54 crc kubenswrapper[4717]: I0308 05:26:54.979883 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7c76bf0ebeec5282069197b4ad2b646975cdae97a598659669bc3bd3bc975c93"} Mar 08 05:26:54 crc kubenswrapper[4717]: I0308 05:26:54.980094 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:54 crc kubenswrapper[4717]: I0308 05:26:54.981258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:54 crc kubenswrapper[4717]: I0308 05:26:54.981302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:54 crc kubenswrapper[4717]: I0308 05:26:54.981315 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:55 crc kubenswrapper[4717]: E0308 05:26:55.117805 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.142857 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.144675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.144767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.144785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.144823 4717 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Mar 08 05:26:55 crc kubenswrapper[4717]: E0308 05:26:55.151488 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.709206 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.985175 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.986148 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.989368 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7c76bf0ebeec5282069197b4ad2b646975cdae97a598659669bc3bd3bc975c93" exitCode=255 Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.989421 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7c76bf0ebeec5282069197b4ad2b646975cdae97a598659669bc3bd3bc975c93"} Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.989472 4717 scope.go:117] "RemoveContainer" containerID="a045c729d0463db0a6fc979cec11ba5b0bbc5032ad935bee3a7dc299df2dc5ea" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.989726 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.991607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.991711 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.991741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:55 crc kubenswrapper[4717]: I0308 05:26:55.992730 4717 scope.go:117] "RemoveContainer" containerID="7c76bf0ebeec5282069197b4ad2b646975cdae97a598659669bc3bd3bc975c93" Mar 08 05:26:55 crc kubenswrapper[4717]: E0308 05:26:55.993105 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:26:56 crc kubenswrapper[4717]: W0308 05:26:56.054286 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 08 05:26:56 crc kubenswrapper[4717]: E0308 05:26:56.054368 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 05:26:56 crc kubenswrapper[4717]: I0308 05:26:56.709151 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:56 crc kubenswrapper[4717]: I0308 05:26:56.995872 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 05:26:57 crc kubenswrapper[4717]: I0308 05:26:57.710874 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:58 crc kubenswrapper[4717]: I0308 05:26:58.682739 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:26:58 crc kubenswrapper[4717]: I0308 05:26:58.683820 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:26:58 crc kubenswrapper[4717]: I0308 05:26:58.685443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:26:58 crc kubenswrapper[4717]: I0308 05:26:58.685526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:26:58 crc kubenswrapper[4717]: I0308 05:26:58.685550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:26:58 crc kubenswrapper[4717]: I0308 05:26:58.686594 4717 scope.go:117] "RemoveContainer" containerID="7c76bf0ebeec5282069197b4ad2b646975cdae97a598659669bc3bd3bc975c93" Mar 08 05:26:58 crc kubenswrapper[4717]: E0308 05:26:58.686929 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:26:58 crc kubenswrapper[4717]: I0308 05:26:58.708647 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:26:59 crc kubenswrapper[4717]: I0308 05:26:59.707519 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:00 crc kubenswrapper[4717]: I0308 05:27:00.118922 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:27:00 crc kubenswrapper[4717]: I0308 05:27:00.119231 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:27:00 crc kubenswrapper[4717]: E0308 05:27:00.125250 4717 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ac67092742df8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 
05:27:00 crc kubenswrapper[4717]: &Event{ObjectMeta:{kube-controller-manager-crc.189ac67092742df8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 05:27:00 crc kubenswrapper[4717]: body: Mar 08 05:27:00 crc kubenswrapper[4717]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:26:30.119435768 +0000 UTC m=+17.037084652,LastTimestamp:2026-03-08 05:27:00.119204645 +0000 UTC m=+47.036853489,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 05:27:00 crc kubenswrapper[4717]: > Mar 08 05:27:00 crc kubenswrapper[4717]: I0308 05:27:00.709613 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:00 crc kubenswrapper[4717]: W0308 05:27:00.998482 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 08 05:27:00 crc kubenswrapper[4717]: E0308 05:27:00.998560 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" 
logger="UnhandledError" Mar 08 05:27:01 crc kubenswrapper[4717]: W0308 05:27:01.090442 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:01 crc kubenswrapper[4717]: E0308 05:27:01.090863 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 05:27:01 crc kubenswrapper[4717]: I0308 05:27:01.449286 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:27:01 crc kubenswrapper[4717]: I0308 05:27:01.449540 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:01 crc kubenswrapper[4717]: I0308 05:27:01.451066 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:01 crc kubenswrapper[4717]: I0308 05:27:01.451132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:01 crc kubenswrapper[4717]: I0308 05:27:01.451152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:01 crc kubenswrapper[4717]: I0308 05:27:01.452064 4717 scope.go:117] "RemoveContainer" containerID="7c76bf0ebeec5282069197b4ad2b646975cdae97a598659669bc3bd3bc975c93" Mar 08 05:27:01 crc kubenswrapper[4717]: E0308 05:27:01.452359 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:27:01 crc kubenswrapper[4717]: I0308 05:27:01.707493 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:02 crc kubenswrapper[4717]: E0308 05:27:02.124568 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 05:27:02 crc kubenswrapper[4717]: I0308 05:27:02.152436 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:02 crc kubenswrapper[4717]: I0308 05:27:02.153950 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:02 crc kubenswrapper[4717]: I0308 05:27:02.153989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:02 crc kubenswrapper[4717]: I0308 05:27:02.153998 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:02 crc kubenswrapper[4717]: I0308 05:27:02.154023 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:27:02 crc kubenswrapper[4717]: E0308 05:27:02.160567 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 05:27:02 crc kubenswrapper[4717]: 
I0308 05:27:02.706633 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:03 crc kubenswrapper[4717]: I0308 05:27:03.710846 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:03 crc kubenswrapper[4717]: E0308 05:27:03.844822 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 05:27:04 crc kubenswrapper[4717]: I0308 05:27:04.708638 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:05 crc kubenswrapper[4717]: I0308 05:27:05.705764 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:06 crc kubenswrapper[4717]: I0308 05:27:06.709542 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:07 crc kubenswrapper[4717]: I0308 05:27:07.710772 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:07 crc 
kubenswrapper[4717]: I0308 05:27:07.986914 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:27:07 crc kubenswrapper[4717]: I0308 05:27:07.987174 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:07 crc kubenswrapper[4717]: I0308 05:27:07.988826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:07 crc kubenswrapper[4717]: I0308 05:27:07.988879 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:07 crc kubenswrapper[4717]: I0308 05:27:07.988904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:07 crc kubenswrapper[4717]: I0308 05:27:07.994596 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:27:08 crc kubenswrapper[4717]: I0308 05:27:08.031772 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:08 crc kubenswrapper[4717]: I0308 05:27:08.033025 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:08 crc kubenswrapper[4717]: I0308 05:27:08.033104 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:08 crc kubenswrapper[4717]: I0308 05:27:08.033130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:08 crc kubenswrapper[4717]: I0308 05:27:08.714725 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:09 crc kubenswrapper[4717]: E0308 05:27:09.133039 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.160826 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.162226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.162288 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.162310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.162349 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:27:09 crc kubenswrapper[4717]: E0308 05:27:09.169917 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.320656 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.320880 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.322353 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.322407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.322428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:09 crc kubenswrapper[4717]: I0308 05:27:09.708003 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:10 crc kubenswrapper[4717]: I0308 05:27:10.707269 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:11 crc kubenswrapper[4717]: I0308 05:27:11.705706 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:12 crc kubenswrapper[4717]: I0308 05:27:12.706056 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:13 crc kubenswrapper[4717]: I0308 05:27:13.710784 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:13 crc kubenswrapper[4717]: E0308 05:27:13.845609 4717 eviction_manager.go:285] "Eviction 
manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 05:27:14 crc kubenswrapper[4717]: I0308 05:27:14.707480 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:15 crc kubenswrapper[4717]: I0308 05:27:15.707244 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:15 crc kubenswrapper[4717]: I0308 05:27:15.780807 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:15 crc kubenswrapper[4717]: I0308 05:27:15.782289 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:15 crc kubenswrapper[4717]: I0308 05:27:15.782327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:15 crc kubenswrapper[4717]: I0308 05:27:15.782338 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:15 crc kubenswrapper[4717]: I0308 05:27:15.782940 4717 scope.go:117] "RemoveContainer" containerID="7c76bf0ebeec5282069197b4ad2b646975cdae97a598659669bc3bd3bc975c93" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.058438 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.060721 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965"} Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.060952 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.062484 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.062515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.062535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:16 crc kubenswrapper[4717]: E0308 05:27:16.140561 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.170605 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.172557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.172620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.172643 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.172712 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:27:16 crc 
kubenswrapper[4717]: E0308 05:27:16.178222 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 05:27:16 crc kubenswrapper[4717]: I0308 05:27:16.707958 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:17 crc kubenswrapper[4717]: I0308 05:27:17.706270 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.066765 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.067314 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.069067 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965" exitCode=255 Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.069113 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965"} Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.069152 4717 scope.go:117] 
"RemoveContainer" containerID="7c76bf0ebeec5282069197b4ad2b646975cdae97a598659669bc3bd3bc975c93" Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.069428 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.070612 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.070643 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.070652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.071321 4717 scope.go:117] "RemoveContainer" containerID="541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965" Mar 08 05:27:18 crc kubenswrapper[4717]: E0308 05:27:18.071538 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.217758 4717 csr.go:261] certificate signing request csr-cjlc7 is approved, waiting to be issued Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.228021 4717 csr.go:257] certificate signing request csr-cjlc7 is issued Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.320541 4717 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.557408 4717 transport.go:147] "Certificate rotation detected, shutting down client connections to start 
using new credentials" Mar 08 05:27:18 crc kubenswrapper[4717]: I0308 05:27:18.681850 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:27:19 crc kubenswrapper[4717]: I0308 05:27:19.073053 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 05:27:19 crc kubenswrapper[4717]: I0308 05:27:19.075740 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:19 crc kubenswrapper[4717]: I0308 05:27:19.076798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:19 crc kubenswrapper[4717]: I0308 05:27:19.076858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:19 crc kubenswrapper[4717]: I0308 05:27:19.076876 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:19 crc kubenswrapper[4717]: I0308 05:27:19.077848 4717 scope.go:117] "RemoveContainer" containerID="541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965" Mar 08 05:27:19 crc kubenswrapper[4717]: E0308 05:27:19.078166 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:27:19 crc kubenswrapper[4717]: I0308 05:27:19.229466 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 
2026-11-15 10:57:13.400529124 +0000 UTC Mar 08 05:27:19 crc kubenswrapper[4717]: I0308 05:27:19.229731 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6053h29m54.170802403s for next certificate rotation Mar 08 05:27:21 crc kubenswrapper[4717]: I0308 05:27:21.448995 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:27:21 crc kubenswrapper[4717]: I0308 05:27:21.449181 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:21 crc kubenswrapper[4717]: I0308 05:27:21.450517 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:21 crc kubenswrapper[4717]: I0308 05:27:21.450546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:21 crc kubenswrapper[4717]: I0308 05:27:21.450561 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:21 crc kubenswrapper[4717]: I0308 05:27:21.451306 4717 scope.go:117] "RemoveContainer" containerID="541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965" Mar 08 05:27:21 crc kubenswrapper[4717]: E0308 05:27:21.451456 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.178424 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.180080 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.180116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.180124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.180226 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.189133 4717 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.189416 4717 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.189437 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.193057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.193090 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.193100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.193112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.193122 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:23Z","lastTransitionTime":"2026-03-08T05:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.212153 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.224660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.224709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.224717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.224733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.224742 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:23Z","lastTransitionTime":"2026-03-08T05:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.238151 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.245959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.245998 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.246008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.246029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.246040 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:23Z","lastTransitionTime":"2026-03-08T05:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.258331 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.265946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.265993 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.266002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.266018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:23 crc kubenswrapper[4717]: I0308 05:27:23.266029 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:23Z","lastTransitionTime":"2026-03-08T05:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.276165 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.276363 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.276400 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.376867 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.477932 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.578379 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.679495 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.780568 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.846548 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.881222 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:23 crc kubenswrapper[4717]: E0308 05:27:23.981965 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:24 crc kubenswrapper[4717]: 
E0308 05:27:24.082917 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:24 crc kubenswrapper[4717]: E0308 05:27:24.183293 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:24 crc kubenswrapper[4717]: E0308 05:27:24.283788 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:24 crc kubenswrapper[4717]: E0308 05:27:24.384945 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:24 crc kubenswrapper[4717]: E0308 05:27:24.485820 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:24 crc kubenswrapper[4717]: E0308 05:27:24.585915 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:24 crc kubenswrapper[4717]: E0308 05:27:24.686736 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:24 crc kubenswrapper[4717]: E0308 05:27:24.787792 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:24 crc kubenswrapper[4717]: E0308 05:27:24.888742 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:24 crc kubenswrapper[4717]: E0308 05:27:24.989603 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:25 crc kubenswrapper[4717]: E0308 05:27:25.090579 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:25 crc kubenswrapper[4717]: E0308 05:27:25.191713 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 08 05:27:25 crc kubenswrapper[4717]: E0308 05:27:25.292471 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:25 crc kubenswrapper[4717]: E0308 05:27:25.393128 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:25 crc kubenswrapper[4717]: E0308 05:27:25.493991 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:25 crc kubenswrapper[4717]: E0308 05:27:25.594091 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:25 crc kubenswrapper[4717]: E0308 05:27:25.694980 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:25 crc kubenswrapper[4717]: E0308 05:27:25.795429 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:25 crc kubenswrapper[4717]: E0308 05:27:25.896138 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:25 crc kubenswrapper[4717]: E0308 05:27:25.996645 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:26 crc kubenswrapper[4717]: E0308 05:27:26.098052 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:26 crc kubenswrapper[4717]: E0308 05:27:26.199700 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:26 crc kubenswrapper[4717]: E0308 05:27:26.299849 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:26 crc kubenswrapper[4717]: E0308 05:27:26.401358 4717 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 08 05:27:26 crc kubenswrapper[4717]: E0308 05:27:26.501523 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:26 crc kubenswrapper[4717]: E0308 05:27:26.602833 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:26 crc kubenswrapper[4717]: E0308 05:27:26.703669 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:26 crc kubenswrapper[4717]: E0308 05:27:26.804270 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:26 crc kubenswrapper[4717]: E0308 05:27:26.904438 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:27 crc kubenswrapper[4717]: E0308 05:27:27.004635 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:27 crc kubenswrapper[4717]: E0308 05:27:27.104821 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:27 crc kubenswrapper[4717]: E0308 05:27:27.205451 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:27 crc kubenswrapper[4717]: E0308 05:27:27.305983 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:27 crc kubenswrapper[4717]: E0308 05:27:27.406092 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:27 crc kubenswrapper[4717]: E0308 05:27:27.507211 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:27 crc kubenswrapper[4717]: E0308 05:27:27.607417 4717 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:27 crc kubenswrapper[4717]: E0308 05:27:27.707614 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:27 crc kubenswrapper[4717]: E0308 05:27:27.808536 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:27 crc kubenswrapper[4717]: E0308 05:27:27.909235 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:28 crc kubenswrapper[4717]: E0308 05:27:28.010067 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:28 crc kubenswrapper[4717]: E0308 05:27:28.111446 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:28 crc kubenswrapper[4717]: E0308 05:27:28.212481 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:28 crc kubenswrapper[4717]: E0308 05:27:28.313555 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:28 crc kubenswrapper[4717]: E0308 05:27:28.414024 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:28 crc kubenswrapper[4717]: E0308 05:27:28.514625 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:28 crc kubenswrapper[4717]: E0308 05:27:28.615054 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:28 crc kubenswrapper[4717]: E0308 05:27:28.715497 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:28 crc 
kubenswrapper[4717]: E0308 05:27:28.816034 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:28 crc kubenswrapper[4717]: E0308 05:27:28.917281 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:29 crc kubenswrapper[4717]: E0308 05:27:29.018614 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:29 crc kubenswrapper[4717]: E0308 05:27:29.119441 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:29 crc kubenswrapper[4717]: E0308 05:27:29.219911 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:29 crc kubenswrapper[4717]: E0308 05:27:29.320033 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:29 crc kubenswrapper[4717]: E0308 05:27:29.421217 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:29 crc kubenswrapper[4717]: E0308 05:27:29.522344 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:29 crc kubenswrapper[4717]: E0308 05:27:29.622878 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:29 crc kubenswrapper[4717]: E0308 05:27:29.724234 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:29 crc kubenswrapper[4717]: I0308 05:27:29.781844 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 05:27:29 crc kubenswrapper[4717]: I0308 05:27:29.783769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 08 05:27:29 crc kubenswrapper[4717]: I0308 05:27:29.784030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:29 crc kubenswrapper[4717]: I0308 05:27:29.784261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:29 crc kubenswrapper[4717]: E0308 05:27:29.825541 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:29 crc kubenswrapper[4717]: E0308 05:27:29.926137 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:30 crc kubenswrapper[4717]: E0308 05:27:30.026297 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:30 crc kubenswrapper[4717]: E0308 05:27:30.127177 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:30 crc kubenswrapper[4717]: E0308 05:27:30.227807 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:30 crc kubenswrapper[4717]: E0308 05:27:30.328911 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:30 crc kubenswrapper[4717]: E0308 05:27:30.429045 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:30 crc kubenswrapper[4717]: E0308 05:27:30.529450 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:30 crc kubenswrapper[4717]: E0308 05:27:30.629942 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:30 crc kubenswrapper[4717]: E0308 05:27:30.730883 4717 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:30 crc kubenswrapper[4717]: E0308 05:27:30.831439 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:30 crc kubenswrapper[4717]: E0308 05:27:30.931518 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:31 crc kubenswrapper[4717]: E0308 05:27:31.032065 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:31 crc kubenswrapper[4717]: E0308 05:27:31.132284 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:31 crc kubenswrapper[4717]: E0308 05:27:31.233323 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:31 crc kubenswrapper[4717]: E0308 05:27:31.334041 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:31 crc kubenswrapper[4717]: E0308 05:27:31.435110 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:31 crc kubenswrapper[4717]: E0308 05:27:31.535472 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:31 crc kubenswrapper[4717]: E0308 05:27:31.636190 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:31 crc kubenswrapper[4717]: E0308 05:27:31.737152 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:31 crc kubenswrapper[4717]: E0308 05:27:31.837891 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:31 crc 
kubenswrapper[4717]: E0308 05:27:31.938446 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:32 crc kubenswrapper[4717]: E0308 05:27:32.039584 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:32 crc kubenswrapper[4717]: E0308 05:27:32.140388 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:32 crc kubenswrapper[4717]: E0308 05:27:32.241115 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:32 crc kubenswrapper[4717]: E0308 05:27:32.342877 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:32 crc kubenswrapper[4717]: E0308 05:27:32.443987 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:32 crc kubenswrapper[4717]: E0308 05:27:32.544161 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:32 crc kubenswrapper[4717]: E0308 05:27:32.644931 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:32 crc kubenswrapper[4717]: E0308 05:27:32.746102 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:32 crc kubenswrapper[4717]: E0308 05:27:32.847009 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:32 crc kubenswrapper[4717]: E0308 05:27:32.948133 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.002477 4717 reflector.go:368] Caches populated for *v1.Node from 
k8s.io/client-go/informers/factory.go:160 Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.051669 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.051973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.052100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.052220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.052336 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.155837 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.155914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.155936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.155968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.155988 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.259760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.259824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.259838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.259858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.259871 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.318777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.318878 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.318900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.318928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.318950 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.330236 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.335383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.335427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.335439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.335458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.335469 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.350715 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.355610 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.355645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.355658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.355698 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.355711 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.367635 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.372446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.372479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.372490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.372506 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.372518 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.388351 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.393188 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.393250 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.393270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.393297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.393317 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.409999 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.410164 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.412536 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.412585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.412603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.412625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.412643 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.515957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.516023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.516042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.516069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.516088 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.618835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.618890 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.618903 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.618923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.618936 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.722184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.722269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.722292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.722326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.722349 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.726313 4717 apiserver.go:52] "Watching apiserver" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.733266 4717 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.733750 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-node-fb27m","openshift-multus/multus-additional-cni-plugins-pkcrh","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-dns/node-resolver-qhwzg","openshift-multus/multus-d6f7j","openshift-multus/network-metrics-daemon-d64q9","openshift-image-registry/node-ca-6j4jn","openshift-machine-config-operator/machine-config-daemon-tb7pf","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb"] Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.734156 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.734198 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.734216 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.734218 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.734269 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.734520 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.734668 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.734922 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qhwzg" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.734999 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.735152 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.735458 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.735488 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.735584 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.735839 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.735928 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.736615 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.736867 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.736961 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.745135 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.745384 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.745419 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.746989 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.747299 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.747538 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.747967 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.748169 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.748551 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.748863 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.749032 4717 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.749332 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.749673 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.750111 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.750159 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.750560 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.752458 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.752551 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.752811 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.752877 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.752876 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 
05:27:33.752926 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.753678 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.752950 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.752965 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.752517 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.753126 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.753166 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.753190 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.753181 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.753196 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.753215 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 08 05:27:33 crc 
kubenswrapper[4717]: I0308 05:27:33.753249 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.753350 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.753402 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.753440 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.755867 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.778488 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.797406 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.808564 4717 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.815529 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.818586 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.818717 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" 
(UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.818769 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.818820 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.818878 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.818933 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.818981 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819035 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819090 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819178 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819233 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819283 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819333 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819403 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819456 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819509 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819567 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819627 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819676 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819778 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819834 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819886 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819940 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819993 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 05:27:33 crc 
kubenswrapper[4717]: I0308 05:27:33.820054 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820107 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820158 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820224 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820282 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820341 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820401 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820450 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820498 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820549 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820603 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820652 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820740 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820793 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820846 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820904 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820959 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.821010 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.821061 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.821114 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.821174 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.821232 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 
05:27:33.821297 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819181 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819269 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.819383 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820022 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820069 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820232 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820262 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820544 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.822110 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820767 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.822277 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820779 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.820803 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.821626 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.821668 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.822455 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.822496 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.821799 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.821913 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.822481 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.822902 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823071 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823224 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.821353 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823414 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823418 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823477 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823527 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 05:27:33 crc 
kubenswrapper[4717]: I0308 05:27:33.823602 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823639 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823678 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823810 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823834 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823857 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823913 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.823963 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824003 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824039 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824077 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824116 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824159 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824191 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824197 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824323 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824341 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824546 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824612 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824623 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824670 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824777 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824836 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824893 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824944 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824993 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825042 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825096 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825150 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825208 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825267 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825332 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825388 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825441 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825492 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825548 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825608 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825666 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825754 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825814 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825873 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825929 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825988 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826042 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826092 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826146 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826218 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826273 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826354 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826408 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826471 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826525 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826582 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826643 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826797 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826862 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826924 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826979 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827033 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827085 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827171 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827229 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827285 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827339 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827395 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827448 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827506 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827565 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827616 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827681 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827770 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827821 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827869 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827928 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827983 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828044 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828103 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828165 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828221 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828276 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828338 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828394 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828453 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828508 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828563 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828620 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828674 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828777 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828834 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828887 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828946 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829002 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.828971 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829063 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829120 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829176 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829238 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829295 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829351 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829408 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829465 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829521 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829583 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829649 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829753 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829812 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829872 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829929 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829986 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830043 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830103 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830158 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830218 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830274 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830330 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830344 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830386 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830445 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 05:27:33 crc kubenswrapper[4717]: 
I0308 05:27:33.830500 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830563 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830652 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830745 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830806 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830862 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830919 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830979 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831030 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831083 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831168 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831226 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831278 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831332 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831377 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831416 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831458 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831500 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831543 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824668 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.824812 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825181 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825334 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.825801 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.826869 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827448 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831767 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831829 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831828 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831850 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831913 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831983 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832044 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 05:27:33 crc 
kubenswrapper[4717]: I0308 05:27:33.832078 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832114 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832180 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832240 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832300 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832379 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832440 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832498 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832556 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832619 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832843 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5508bbd-d773-4b40-a641-e538e619bc1b-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832895 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-etc-openvswitch\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832935 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832973 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-systemd\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833021 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-log-socket\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833067 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-script-lib\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833114 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-cni-dir\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833162 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-cnibin\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833310 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-var-lib-cni-multus\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833366 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-slash\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833416 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/95c5996b-1216-4f9c-bc1f-0ca06f8de088-cni-binary-copy\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833466 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrklm\" (UniqueName: \"kubernetes.io/projected/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-kube-api-access-hrklm\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833516 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-bin\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833563 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-netd\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833612 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9e996d1c-6f08-4f2d-a64b-e6f58300117d-serviceca\") pod \"node-ca-6j4jn\" (UID: \"9e996d1c-6f08-4f2d-a64b-e6f58300117d\") " pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833673 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833788 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-mcd-auth-proxy-config\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833842 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-system-cni-dir\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833896 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddlf2\" (UniqueName: \"kubernetes.io/projected/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-kube-api-access-ddlf2\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833945 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-system-cni-dir\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834004 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r82qg\" (UniqueName: \"kubernetes.io/projected/9e996d1c-6f08-4f2d-a64b-e6f58300117d-kube-api-access-r82qg\") pod \"node-ca-6j4jn\" (UID: \"9e996d1c-6f08-4f2d-a64b-e6f58300117d\") " pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834057 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-rootfs\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834757 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834783 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834812 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgxsz\" (UniqueName: \"kubernetes.io/projected/95c5996b-1216-4f9c-bc1f-0ca06f8de088-kube-api-access-bgxsz\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834856 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834892 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-systemd-units\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834866 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-config\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835056 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835099 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835160 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-run-k8s-cni-cncf-io\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835200 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-var-lib-kubelet\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835233 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e996d1c-6f08-4f2d-a64b-e6f58300117d-host\") pod \"node-ca-6j4jn\" (UID: \"9e996d1c-6f08-4f2d-a64b-e6f58300117d\") " pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835274 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835310 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-openvswitch\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835342 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-ovn\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835397 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835440 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835481 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5508bbd-d773-4b40-a641-e538e619bc1b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835521 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-env-overrides\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835558 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6btr2\" (UniqueName: \"kubernetes.io/projected/b862036c-9fe5-43c3-87a4-9ff24595c456-kube-api-access-6btr2\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835600 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835649 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwsgc\" (UniqueName: \"kubernetes.io/projected/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-kube-api-access-zwsgc\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835740 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-os-release\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835798 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-var-lib-cni-bin\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835852 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5a6f4d53-3a88-4caa-b66c-3254cd82186b-hosts-file\") pod \"node-resolver-qhwzg\" (UID: \"5a6f4d53-3a88-4caa-b66c-3254cd82186b\") " pod="openshift-dns/node-resolver-qhwzg" Mar 08 05:27:33 crc 
kubenswrapper[4717]: I0308 05:27:33.836062 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzbcs\" (UniqueName: \"kubernetes.io/projected/5a6f4d53-3a88-4caa-b66c-3254cd82186b-kube-api-access-hzbcs\") pod \"node-resolver-qhwzg\" (UID: \"5a6f4d53-3a88-4caa-b66c-3254cd82186b\") " pod="openshift-dns/node-resolver-qhwzg" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836124 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-node-log\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836172 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-ovn-kubernetes\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836219 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-cni-binary-copy\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836270 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvft8\" (UniqueName: \"kubernetes.io/projected/a5508bbd-d773-4b40-a641-e538e619bc1b-kube-api-access-lvft8\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: 
\"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836323 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-kubelet\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836371 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b862036c-9fe5-43c3-87a4-9ff24595c456-ovn-node-metrics-cert\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836437 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836495 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-proxy-tls\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836548 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-hostroot\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836756 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5508bbd-d773-4b40-a641-e538e619bc1b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836829 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836898 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836958 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837008 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-run-multus-certs\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837258 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-netns\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837312 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-var-lib-openvswitch\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837359 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837515 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-socket-dir-parent\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837574 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-conf-dir\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837629 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837719 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-os-release\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837776 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-daemon-config\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837810 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-etc-kubernetes\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837849 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837883 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-cnibin\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837916 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-run-netns\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837997 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838024 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838047 4717 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838070 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838093 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838114 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838140 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838162 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath 
\"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838183 4717 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838206 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838229 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838252 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838274 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838296 4717 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838324 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838354 4717 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838385 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838416 4717 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838448 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838480 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838513 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838542 4717 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838570 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842046 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842085 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842269 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842292 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842312 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842328 4717 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842345 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842359 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842370 4717 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842380 4717 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842392 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842404 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842415 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842426 4717 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 
crc kubenswrapper[4717]: I0308 05:27:33.842438 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842453 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.843144 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832110 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832200 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.829962 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830718 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830793 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830835 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.843442 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830881 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.830207 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831149 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831086 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831413 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831495 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.831524 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832244 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832039 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832583 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832641 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832666 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832965 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833560 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833429 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.832732 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.833780 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834011 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834185 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834189 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834368 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.834490 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835054 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835358 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835759 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835800 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835819 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835835 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.835910 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836006 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836427 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836487 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836570 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836662 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837151 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837164 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.836852 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837479 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837232 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837580 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.837672 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838409 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838460 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.838970 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.839001 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.839080 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.839110 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.839434 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.839386 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.839505 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.839544 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.839548 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.839849 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.840387 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.840410 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.840463 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.841099 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.841174 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.841191 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.841620 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842014 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842260 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842421 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.842859 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.843205 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.843496 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.843629 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.843844 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.844020 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.843994 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.844049 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.844470 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.844618 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.845009 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.845035 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.845117 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.845151 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.845154 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.845178 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.827965 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.845226 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.845281 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.845283 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.845505 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-08 05:27:34.345471141 +0000 UTC m=+81.263120025 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.845851 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.845897 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.845797 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.845945 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:34.345918122 +0000 UTC m=+81.263567176 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.846163 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.846171 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.846237 4717 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.846747 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.846912 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.847328 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.847332 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.847967 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.848300 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.848431 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.849018 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.849035 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.849472 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:27:34.349445972 +0000 UTC m=+81.267095026 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.850920 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.851187 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.851433 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.851546 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.851832 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.855732 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.858526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.859367 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.860629 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.864345 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.865161 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.865184 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.865704 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.866659 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.866950 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.866983 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.867003 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.868527 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.869706 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.869743 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.869828 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:34.369806498 +0000 UTC m=+81.287455352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.871516 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.872553 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:34.372524877 +0000 UTC m=+81.290173811 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.876935 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.876944 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.877084 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.877120 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.877268 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.877374 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.877757 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.877903 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.878161 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.878295 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.878308 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.878483 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.878940 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.879007 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.879641 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.879811 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.879900 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.879988 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.880424 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.880775 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.880852 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.881328 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.881372 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.881517 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.881764 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.881795 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.881945 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.882061 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.882054 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.886396 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.886709 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.886712 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.886771 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.888032 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.893250 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.895644 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.895779 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.896063 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.896819 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.897096 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.897235 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.897387 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.899200 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.899826 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.912288 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.914931 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.920773 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.925851 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.925968 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.934433 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.936991 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.938204 4717 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.938261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.938271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.938309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.938321 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:33Z","lastTransitionTime":"2026-03-08T05:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.943813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5508bbd-d773-4b40-a641-e538e619bc1b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.943871 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-env-overrides\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.943902 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6btr2\" (UniqueName: \"kubernetes.io/projected/b862036c-9fe5-43c3-87a4-9ff24595c456-kube-api-access-6btr2\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944656 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-env-overrides\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944706 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwsgc\" (UniqueName: \"kubernetes.io/projected/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-kube-api-access-zwsgc\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944761 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-var-lib-cni-bin\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944792 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-os-release\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944819 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzbcs\" (UniqueName: \"kubernetes.io/projected/5a6f4d53-3a88-4caa-b66c-3254cd82186b-kube-api-access-hzbcs\") pod \"node-resolver-qhwzg\" (UID: \"5a6f4d53-3a88-4caa-b66c-3254cd82186b\") " pod="openshift-dns/node-resolver-qhwzg" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944848 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-node-log\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944625 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5508bbd-d773-4b40-a641-e538e619bc1b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc 
kubenswrapper[4717]: I0308 05:27:33.944875 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-ovn-kubernetes\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944895 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-os-release\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944926 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-node-log\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944904 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-cni-binary-copy\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944957 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-ovn-kubernetes\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944846 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-var-lib-cni-bin\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.944986 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5a6f4d53-3a88-4caa-b66c-3254cd82186b-hosts-file\") pod \"node-resolver-qhwzg\" (UID: \"5a6f4d53-3a88-4caa-b66c-3254cd82186b\") " pod="openshift-dns/node-resolver-qhwzg" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945011 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvft8\" (UniqueName: \"kubernetes.io/projected/a5508bbd-d773-4b40-a641-e538e619bc1b-kube-api-access-lvft8\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945031 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-kubelet\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945070 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b862036c-9fe5-43c3-87a4-9ff24595c456-ovn-node-metrics-cert\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945092 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-proxy-tls\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945109 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-hostroot\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945111 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-kubelet\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945146 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5508bbd-d773-4b40-a641-e538e619bc1b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945166 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5a6f4d53-3a88-4caa-b66c-3254cd82186b-hosts-file\") pod \"node-resolver-qhwzg\" (UID: \"5a6f4d53-3a88-4caa-b66c-3254cd82186b\") " pod="openshift-dns/node-resolver-qhwzg" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945182 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-netns\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-var-lib-openvswitch\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945244 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-socket-dir-parent\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945261 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-conf-dir\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945304 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-run-multus-certs\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945325 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945345 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-os-release\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945385 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-daemon-config\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945405 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-etc-kubernetes\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945421 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945438 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-cnibin\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945474 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-run-netns\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945500 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5508bbd-d773-4b40-a641-e538e619bc1b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945517 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-etc-openvswitch\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945558 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945577 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-systemd\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945584 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-os-release\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-log-socket\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-log-socket\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945635 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-script-lib\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945662 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-cni-dir\") pod 
\"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-cnibin\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945735 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-var-lib-cni-multus\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945762 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-slash\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945786 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95c5996b-1216-4f9c-bc1f-0ca06f8de088-cni-binary-copy\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945793 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-hostroot\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945810 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrklm\" (UniqueName: \"kubernetes.io/projected/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-kube-api-access-hrklm\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-bin\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945858 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-netd\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945883 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-mcd-auth-proxy-config\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945907 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-system-cni-dir\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 
05:27:33.945933 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddlf2\" (UniqueName: \"kubernetes.io/projected/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-kube-api-access-ddlf2\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945960 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-system-cni-dir\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945985 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9e996d1c-6f08-4f2d-a64b-e6f58300117d-serviceca\") pod \"node-ca-6j4jn\" (UID: \"9e996d1c-6f08-4f2d-a64b-e6f58300117d\") " pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946008 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-rootfs\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946059 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgxsz\" (UniqueName: \"kubernetes.io/projected/95c5996b-1216-4f9c-bc1f-0ca06f8de088-kube-api-access-bgxsz\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946086 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r82qg\" (UniqueName: \"kubernetes.io/projected/9e996d1c-6f08-4f2d-a64b-e6f58300117d-kube-api-access-r82qg\") pod \"node-ca-6j4jn\" (UID: \"9e996d1c-6f08-4f2d-a64b-e6f58300117d\") " pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946109 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-systemd-units\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946134 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-config\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946158 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946185 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946209 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-run-k8s-cni-cncf-io\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946236 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-var-lib-kubelet\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946258 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-daemon-config\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946263 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-openvswitch\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946288 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-ovn\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946294 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-etc-kubernetes\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946315 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e996d1c-6f08-4f2d-a64b-e6f58300117d-host\") pod \"node-ca-6j4jn\" (UID: \"9e996d1c-6f08-4f2d-a64b-e6f58300117d\") " pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946321 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945534 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-cni-binary-copy\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946429 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-systemd\") pod \"ovnkube-node-fb27m\" (UID: 
\"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946436 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-script-lib\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946470 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-cni-dir\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946484 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-netns\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946502 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-conf-dir\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946511 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-cnibin\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946515 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-run-multus-certs\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946513 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-multus-socket-dir-parent\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946552 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-var-lib-cni-multus\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946538 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946567 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-var-lib-openvswitch\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946581 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-slash\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.945760 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5508bbd-d773-4b40-a641-e538e619bc1b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946602 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-systemd-units\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946639 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-run-k8s-cni-cncf-io\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946672 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-var-lib-kubelet\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946722 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-openvswitch\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946753 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-ovn\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946780 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e996d1c-6f08-4f2d-a64b-e6f58300117d-host\") pod \"node-ca-6j4jn\" (UID: \"9e996d1c-6f08-4f2d-a64b-e6f58300117d\") " pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946804 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-cnibin\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.946827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-host-run-netns\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.947030 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95c5996b-1216-4f9c-bc1f-0ca06f8de088-system-cni-dir\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " 
pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.947025 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-bin\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.947089 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-rootfs\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.947098 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-system-cni-dir\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.947224 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-etc-openvswitch\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.947269 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-netd\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 
05:27:33.947308 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.947392 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:33 crc kubenswrapper[4717]: E0308 05:27:33.947523 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs podName:dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:34.447491728 +0000 UTC m=+81.365140582 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs") pod "network-metrics-daemon-d64q9" (UID: "dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948033 4717 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948060 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948076 4717 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948094 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948112 4717 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948132 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948147 4717 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948162 4717 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948181 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948141 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-config\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948201 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948242 4717 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948266 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948284 4717 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948302 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948316 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948330 4717 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948344 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948358 4717 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948357 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9e996d1c-6f08-4f2d-a64b-e6f58300117d-serviceca\") pod \"node-ca-6j4jn\" (UID: \"9e996d1c-6f08-4f2d-a64b-e6f58300117d\") " pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948372 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948388 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948404 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948417 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" 
(UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948431 4717 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948445 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948459 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948473 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948486 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948498 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948514 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948527 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948541 4717 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948555 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948566 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948581 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948594 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948606 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on 
node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948618 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948631 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948644 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948656 4717 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948669 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948710 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948799 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: 
I0308 05:27:33.948891 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.948909 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.950855 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.950876 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.950893 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.950909 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.949542 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-mcd-auth-proxy-config\") pod \"machine-config-daemon-tb7pf\" (UID: 
\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.950926 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.950096 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5508bbd-d773-4b40-a641-e538e619bc1b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.950943 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.950997 4717 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951014 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951028 4717 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951040 4717 
reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951055 4717 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951070 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951084 4717 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951098 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951111 4717 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951124 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951139 4717 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951152 4717 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951165 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951201 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951215 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951228 4717 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951241 4717 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951254 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" 
DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.950571 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95c5996b-1216-4f9c-bc1f-0ca06f8de088-cni-binary-copy\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951267 4717 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951280 4717 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951293 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951306 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951318 4717 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951331 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 
05:27:33.951346 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951360 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951374 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951386 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951398 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951412 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951424 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951436 4717 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951449 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951462 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951475 4717 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951487 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951499 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951514 4717 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951527 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951539 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951551 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951563 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951574 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951587 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951599 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951611 4717 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" 
DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951623 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951635 4717 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951648 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951660 4717 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951674 4717 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951702 4717 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951719 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: 
I0308 05:27:33.951736 4717 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951753 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951766 4717 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951778 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951792 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951804 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951818 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951830 4717 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951843 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951856 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951869 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951882 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951896 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951908 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951921 4717 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath 
\"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951935 4717 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951949 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951961 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951973 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951987 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.951999 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952011 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952023 4717 reconciler_common.go:293] 
"Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952035 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952049 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952061 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952075 4717 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952087 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952099 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952112 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" 
DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952124 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952136 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952148 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952160 4717 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952173 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952185 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952197 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952209 4717 reconciler_common.go:293] "Volume detached 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952222 4717 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952236 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952249 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952263 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952277 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952290 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952302 4717 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952316 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952328 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952340 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952352 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952365 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.952378 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.953048 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/b862036c-9fe5-43c3-87a4-9ff24595c456-ovn-node-metrics-cert\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.956099 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.956239 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.962403 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6btr2\" (UniqueName: \"kubernetes.io/projected/b862036c-9fe5-43c3-87a4-9ff24595c456-kube-api-access-6btr2\") pod \"ovnkube-node-fb27m\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.963589 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-proxy-tls\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.963833 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.967855 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgxsz\" (UniqueName: \"kubernetes.io/projected/95c5996b-1216-4f9c-bc1f-0ca06f8de088-kube-api-access-bgxsz\") pod \"multus-d6f7j\" (UID: \"95c5996b-1216-4f9c-bc1f-0ca06f8de088\") " pod="openshift-multus/multus-d6f7j" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.968211 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvft8\" (UniqueName: \"kubernetes.io/projected/a5508bbd-d773-4b40-a641-e538e619bc1b-kube-api-access-lvft8\") pod \"ovnkube-control-plane-749d76644c-q67qb\" (UID: \"a5508bbd-d773-4b40-a641-e538e619bc1b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.971129 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzbcs\" (UniqueName: \"kubernetes.io/projected/5a6f4d53-3a88-4caa-b66c-3254cd82186b-kube-api-access-hzbcs\") pod \"node-resolver-qhwzg\" (UID: \"5a6f4d53-3a88-4caa-b66c-3254cd82186b\") " pod="openshift-dns/node-resolver-qhwzg" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.971594 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r82qg\" (UniqueName: \"kubernetes.io/projected/9e996d1c-6f08-4f2d-a64b-e6f58300117d-kube-api-access-r82qg\") pod \"node-ca-6j4jn\" (UID: \"9e996d1c-6f08-4f2d-a64b-e6f58300117d\") " pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.972212 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddlf2\" (UniqueName: \"kubernetes.io/projected/a5c6317f-efb5-4d91-b5df-c56e975f7c1c-kube-api-access-ddlf2\") pod 
\"multus-additional-cni-plugins-pkcrh\" (UID: \"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\") " pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.976302 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrklm\" (UniqueName: \"kubernetes.io/projected/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-kube-api-access-hrklm\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.977274 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwsgc\" (UniqueName: \"kubernetes.io/projected/7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e-kube-api-access-zwsgc\") pod \"machine-config-daemon-tb7pf\" (UID: \"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\") " pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.979749 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:33 crc kubenswrapper[4717]: I0308 05:27:33.992246 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.010718 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.021844 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.033984 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.042538 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.043528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.043583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.043598 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.043621 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.043638 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:34Z","lastTransitionTime":"2026-03-08T05:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.054162 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.065205 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.068002 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.074264 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.077973 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.081350 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.085445 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qhwzg" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.092091 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.093911 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.100816 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d6f7j" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.102165 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: W0308 05:27:34.105183 4717 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-6b4c667a9fd5b7986dd84e93e2cc7fca7ed6fd95a9ae8fb380d298a280bd8191 WatchSource:0}: Error finding container 6b4c667a9fd5b7986dd84e93e2cc7fca7ed6fd95a9ae8fb380d298a280bd8191: Status 404 returned error can't find the container with id 6b4c667a9fd5b7986dd84e93e2cc7fca7ed6fd95a9ae8fb380d298a280bd8191 Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.106484 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:34 crc kubenswrapper[4717]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 08 05:27:34 crc kubenswrapper[4717]: set -o allexport Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: source /etc/kubernetes/apiserver-url.env Mar 08 05:27:34 crc kubenswrapper[4717]: else Mar 08 05:27:34 crc kubenswrapper[4717]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 05:27:34 crc kubenswrapper[4717]: exit 1 Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 05:27:34 crc kubenswrapper[4717]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:34 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.109604 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.109722 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6j4jn" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.113748 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePo
licy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.115313 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.117617 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8e00324533fb4320eda80ca50e9a1621f826a1559d4fea58f4c87c3fb31c45c3"} Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.118896 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.119734 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6b4c667a9fd5b7986dd84e93e2cc7fca7ed6fd95a9ae8fb380d298a280bd8191"} Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.121401 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: W0308 05:27:34.124569 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a6f4d53_3a88_4caa_b66c_3254cd82186b.slice/crio-7293277ceeec1869b205f563b0aa95966807bf4d98b2c52cceee0157bd566a03 WatchSource:0}: Error finding container 7293277ceeec1869b205f563b0aa95966807bf4d98b2c52cceee0157bd566a03: Status 404 returned error can't find the container with id 7293277ceeec1869b205f563b0aa95966807bf4d98b2c52cceee0157bd566a03 Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.126300 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:34 crc kubenswrapper[4717]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 08 05:27:34 crc kubenswrapper[4717]: set -o allexport Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: source /etc/kubernetes/apiserver-url.env Mar 08 05:27:34 crc kubenswrapper[4717]: else Mar 08 05:27:34 crc kubenswrapper[4717]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 05:27:34 crc kubenswrapper[4717]: exit 1 Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 05:27:34 crc kubenswrapper[4717]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:34 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.127892 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.129005 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: W0308 05:27:34.129577 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c5996b_1216_4f9c_bc1f_0ca06f8de088.slice/crio-b17ec4daf063c5703b00256072455de7f5b16699094429ac601ee68a37701dc1 WatchSource:0}: Error finding container b17ec4daf063c5703b00256072455de7f5b16699094429ac601ee68a37701dc1: Status 404 returned error can't find the container with id b17ec4daf063c5703b00256072455de7f5b16699094429ac601ee68a37701dc1 Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.129592 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.130186 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.130583 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:34 crc kubenswrapper[4717]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 08 05:27:34 crc kubenswrapper[4717]: set -uo pipefail Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 08 05:27:34 crc kubenswrapper[4717]: HOSTS_FILE="/etc/hosts" Mar 08 05:27:34 crc 
kubenswrapper[4717]: TEMP_FILE="/etc/hosts.tmp" Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: # Make a temporary file with the old hosts file's attributes. Mar 08 05:27:34 crc kubenswrapper[4717]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 08 05:27:34 crc kubenswrapper[4717]: echo "Failed to preserve hosts file. Exiting." Mar 08 05:27:34 crc kubenswrapper[4717]: exit 1 Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: while true; do Mar 08 05:27:34 crc kubenswrapper[4717]: declare -A svc_ips Mar 08 05:27:34 crc kubenswrapper[4717]: for svc in "${services[@]}"; do Mar 08 05:27:34 crc kubenswrapper[4717]: # Fetch service IP from cluster dns if present. We make several tries Mar 08 05:27:34 crc kubenswrapper[4717]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 08 05:27:34 crc kubenswrapper[4717]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 08 05:27:34 crc kubenswrapper[4717]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 08 05:27:34 crc kubenswrapper[4717]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 05:27:34 crc kubenswrapper[4717]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 05:27:34 crc kubenswrapper[4717]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 05:27:34 crc kubenswrapper[4717]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 08 05:27:34 crc kubenswrapper[4717]: for i in ${!cmds[*]} Mar 08 05:27:34 crc kubenswrapper[4717]: do Mar 08 05:27:34 crc kubenswrapper[4717]: ips=($(eval "${cmds[i]}")) Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: svc_ips["${svc}"]="${ips[@]}" Mar 08 05:27:34 crc kubenswrapper[4717]: break Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: done Mar 08 05:27:34 crc kubenswrapper[4717]: done Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: # Update /etc/hosts only if we get valid service IPs Mar 08 05:27:34 crc kubenswrapper[4717]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 08 05:27:34 crc kubenswrapper[4717]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 08 05:27:34 crc kubenswrapper[4717]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 08 05:27:34 crc kubenswrapper[4717]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 08 05:27:34 crc kubenswrapper[4717]: sleep 60 & wait Mar 08 05:27:34 crc kubenswrapper[4717]: continue Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: # Append resolver entries for services Mar 08 05:27:34 crc kubenswrapper[4717]: rc=0 Mar 08 05:27:34 crc kubenswrapper[4717]: for svc in "${!svc_ips[@]}"; do Mar 08 05:27:34 crc kubenswrapper[4717]: for ip in ${svc_ips[${svc}]}; do Mar 08 05:27:34 crc kubenswrapper[4717]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 08 05:27:34 crc kubenswrapper[4717]: done Mar 08 05:27:34 crc kubenswrapper[4717]: done Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ $rc -ne 0 ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: sleep 60 & wait Mar 08 05:27:34 crc kubenswrapper[4717]: continue Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 08 05:27:34 crc kubenswrapper[4717]: # Replace /etc/hosts with our modified version if needed Mar 08 05:27:34 crc kubenswrapper[4717]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 08 05:27:34 crc kubenswrapper[4717]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: sleep 60 & wait Mar 08 05:27:34 crc kubenswrapper[4717]: unset svc_ips Mar 08 05:27:34 crc kubenswrapper[4717]: done Mar 08 05:27:34 crc kubenswrapper[4717]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzbcs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-qhwzg_openshift-dns(5a6f4d53-3a88-4caa-b66c-3254cd82186b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:34 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.132492 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.132894 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-qhwzg" podUID="5a6f4d53-3a88-4caa-b66c-3254cd82186b" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.133788 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:34 crc kubenswrapper[4717]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 08 05:27:34 crc kubenswrapper[4717]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 08 05:27:34 crc kubenswrapper[4717]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bgxsz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-d6f7j_openshift-multus(95c5996b-1216-4f9c-bc1f-0ca06f8de088): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:34 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.135345 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-d6f7j" podUID="95c5996b-1216-4f9c-bc1f-0ca06f8de088" Mar 08 05:27:34 crc kubenswrapper[4717]: W0308 05:27:34.141762 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-069a044677c14b9d84b60b7b352d68dc390219807c99a6fee0c6aa26b2832032 WatchSource:0}: Error finding container 069a044677c14b9d84b60b7b352d68dc390219807c99a6fee0c6aa26b2832032: Status 404 returned error can't find the container with id 069a044677c14b9d84b60b7b352d68dc390219807c99a6fee0c6aa26b2832032 Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.143545 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.147724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.147780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.147804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.147833 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.147855 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:34Z","lastTransitionTime":"2026-03-08T05:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.149516 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:34 crc kubenswrapper[4717]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ -f "/env/_master" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: set -o allexport Mar 08 05:27:34 crc kubenswrapper[4717]: source "/env/_master" Mar 08 05:27:34 crc kubenswrapper[4717]: set +o allexport Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 08 05:27:34 crc kubenswrapper[4717]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 08 05:27:34 crc kubenswrapper[4717]: ho_enable="--enable-hybrid-overlay" Mar 08 05:27:34 crc kubenswrapper[4717]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 08 05:27:34 crc kubenswrapper[4717]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 08 05:27:34 crc kubenswrapper[4717]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 08 05:27:34 crc kubenswrapper[4717]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 05:27:34 crc kubenswrapper[4717]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 08 05:27:34 crc kubenswrapper[4717]: --webhook-host=127.0.0.1 \ Mar 08 05:27:34 crc kubenswrapper[4717]: --webhook-port=9743 \ Mar 08 05:27:34 crc kubenswrapper[4717]: ${ho_enable} \ Mar 08 05:27:34 crc kubenswrapper[4717]: --enable-interconnect \ Mar 08 05:27:34 crc kubenswrapper[4717]: --disable-approver \ Mar 08 05:27:34 crc kubenswrapper[4717]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 08 05:27:34 crc kubenswrapper[4717]: --wait-for-kubernetes-api=200s \ Mar 08 05:27:34 crc kubenswrapper[4717]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 08 05:27:34 crc kubenswrapper[4717]: --loglevel="${LOGLEVEL}" Mar 08 05:27:34 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:34 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.153477 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" Mar 08 05:27:34 crc kubenswrapper[4717]: W0308 05:27:34.153653 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e996d1c_6f08_4f2d_a64b_e6f58300117d.slice/crio-7c327fa7b973249b8c50b6200e24c8c619f91b7ecad57a75313397bae29b1895 WatchSource:0}: Error finding container 7c327fa7b973249b8c50b6200e24c8c619f91b7ecad57a75313397bae29b1895: Status 404 returned error can't find the container with id 7c327fa7b973249b8c50b6200e24c8c619f91b7ecad57a75313397bae29b1895 Mar 08 05:27:34 crc kubenswrapper[4717]: W0308 05:27:34.155677 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5c6317f_efb5_4d91_b5df_c56e975f7c1c.slice/crio-33663a72b6de6b7d9a7a59efe0b4e1fdd9b41bce749ca8c91df00382273b146f WatchSource:0}: Error finding container 33663a72b6de6b7d9a7a59efe0b4e1fdd9b41bce749ca8c91df00382273b146f: Status 404 returned error can't find the container with id 33663a72b6de6b7d9a7a59efe0b4e1fdd9b41bce749ca8c91df00382273b146f Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.156861 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:34 crc kubenswrapper[4717]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ -f "/env/_master" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: set -o allexport Mar 08 05:27:34 crc kubenswrapper[4717]: source "/env/_master" Mar 08 05:27:34 crc kubenswrapper[4717]: set +o allexport Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 08 05:27:34 crc kubenswrapper[4717]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 05:27:34 crc kubenswrapper[4717]: --disable-webhook \ Mar 08 05:27:34 crc kubenswrapper[4717]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 08 05:27:34 crc kubenswrapper[4717]: --loglevel="${LOGLEVEL}" Mar 08 05:27:34 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:34 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.157115 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.157970 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.158728 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:34 crc kubenswrapper[4717]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 08 05:27:34 crc kubenswrapper[4717]: while [ true ]; Mar 08 05:27:34 crc kubenswrapper[4717]: do Mar 08 05:27:34 crc kubenswrapper[4717]: for f in $(ls /tmp/serviceca); do Mar 08 05:27:34 crc kubenswrapper[4717]: echo $f Mar 08 05:27:34 crc kubenswrapper[4717]: ca_file_path="/tmp/serviceca/${f}" Mar 08 05:27:34 crc kubenswrapper[4717]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 08 05:27:34 crc kubenswrapper[4717]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 08 
05:27:34 crc kubenswrapper[4717]: if [ -e "${reg_dir_path}" ]; then Mar 08 05:27:34 crc kubenswrapper[4717]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 08 05:27:34 crc kubenswrapper[4717]: else Mar 08 05:27:34 crc kubenswrapper[4717]: mkdir $reg_dir_path Mar 08 05:27:34 crc kubenswrapper[4717]: cp $ca_file_path $reg_dir_path/ca.crt Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: done Mar 08 05:27:34 crc kubenswrapper[4717]: for d in $(ls /etc/docker/certs.d); do Mar 08 05:27:34 crc kubenswrapper[4717]: echo $d Mar 08 05:27:34 crc kubenswrapper[4717]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 08 05:27:34 crc kubenswrapper[4717]: reg_conf_path="/tmp/serviceca/${dp}" Mar 08 05:27:34 crc kubenswrapper[4717]: if [ ! -e "${reg_conf_path}" ]; then Mar 08 05:27:34 crc kubenswrapper[4717]: rm -rf /etc/docker/certs.d/$d Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: done Mar 08 05:27:34 crc kubenswrapper[4717]: sleep 60 & wait ${!} Mar 08 05:27:34 crc kubenswrapper[4717]: done Mar 08 05:27:34 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r82qg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-6j4jn_openshift-image-registry(9e996d1c-6f08-4f2d-a64b-e6f58300117d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:34 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.162032 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-6j4jn" podUID="9e996d1c-6f08-4f2d-a64b-e6f58300117d" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.165093 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ddlf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-pkcrh_openshift-multus(a5c6317f-efb5-4d91-b5df-c56e975f7c1c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.165960 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwsgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.166248 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" podUID="a5c6317f-efb5-4d91-b5df-c56e975f7c1c" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.168632 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwsgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.170292 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.170863 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: W0308 05:27:34.176916 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5508bbd_d773_4b40_a641_e538e619bc1b.slice/crio-618ed0daad9da2626773e0dc4e6956460795b7b9ed480605bb5b828316932d49 WatchSource:0}: Error finding container 618ed0daad9da2626773e0dc4e6956460795b7b9ed480605bb5b828316932d49: Status 404 returned error can't find the container with id 618ed0daad9da2626773e0dc4e6956460795b7b9ed480605bb5b828316932d49 Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.178862 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:34 crc kubenswrapper[4717]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 08 05:27:34 crc kubenswrapper[4717]: set -euo pipefail Mar 08 05:27:34 crc kubenswrapper[4717]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 08 05:27:34 crc kubenswrapper[4717]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 08 05:27:34 crc kubenswrapper[4717]: # As the secret mount is optional we must wait for the files to be present. Mar 08 05:27:34 crc kubenswrapper[4717]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 08 05:27:34 crc kubenswrapper[4717]: TS=$(date +%s) Mar 08 05:27:34 crc kubenswrapper[4717]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 08 05:27:34 crc kubenswrapper[4717]: HAS_LOGGED_INFO=0 Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: log_missing_certs(){ Mar 08 05:27:34 crc kubenswrapper[4717]: CUR_TS=$(date +%s) Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 08 05:27:34 crc kubenswrapper[4717]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 08 05:27:34 crc kubenswrapper[4717]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 08 05:27:34 crc kubenswrapper[4717]: HAS_LOGGED_INFO=1 Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: } Mar 08 05:27:34 crc kubenswrapper[4717]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 08 05:27:34 crc kubenswrapper[4717]: log_missing_certs Mar 08 05:27:34 crc kubenswrapper[4717]: sleep 5 Mar 08 05:27:34 crc kubenswrapper[4717]: done Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 08 05:27:34 crc kubenswrapper[4717]: exec /usr/bin/kube-rbac-proxy \ Mar 08 05:27:34 crc kubenswrapper[4717]: --logtostderr \ Mar 08 05:27:34 crc kubenswrapper[4717]: --secure-listen-address=:9108 \ Mar 08 05:27:34 crc kubenswrapper[4717]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 08 05:27:34 crc kubenswrapper[4717]: --upstream=http://127.0.0.1:29108/ \ Mar 08 05:27:34 crc kubenswrapper[4717]: --tls-private-key-file=${TLS_PK} \ Mar 08 05:27:34 crc kubenswrapper[4717]: --tls-cert-file=${TLS_CERT} Mar 08 05:27:34 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvft8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-q67qb_openshift-ovn-kubernetes(a5508bbd-d773-4b40-a641-e538e619bc1b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:34 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.180616 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.181373 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:34 crc kubenswrapper[4717]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ -f "/env/_master" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: set -o allexport Mar 08 05:27:34 crc kubenswrapper[4717]: source "/env/_master" Mar 08 05:27:34 crc kubenswrapper[4717]: set +o allexport Mar 08 
05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: ovn_v4_join_subnet_opt= Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ "" != "" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: ovn_v6_join_subnet_opt= Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ "" != "" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: ovn_v4_transit_switch_subnet_opt= Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ "" != "" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: ovn_v6_transit_switch_subnet_opt= Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ "" != "" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: dns_name_resolver_enabled_flag= Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ "false" == "true" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: persistent_ips_enabled_flag= Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ "true" == "true" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: # 
This is needed so that converting clusters from GA to TP Mar 08 05:27:34 crc kubenswrapper[4717]: # will rollout control plane pods as well Mar 08 05:27:34 crc kubenswrapper[4717]: network_segmentation_enabled_flag= Mar 08 05:27:34 crc kubenswrapper[4717]: multi_network_enabled_flag= Mar 08 05:27:34 crc kubenswrapper[4717]: if [[ "true" == "true" ]]; then Mar 08 05:27:34 crc kubenswrapper[4717]: multi_network_enabled_flag="--enable-multi-network" Mar 08 05:27:34 crc kubenswrapper[4717]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 08 05:27:34 crc kubenswrapper[4717]: fi Mar 08 05:27:34 crc kubenswrapper[4717]: Mar 08 05:27:34 crc kubenswrapper[4717]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 08 05:27:34 crc kubenswrapper[4717]: exec /usr/bin/ovnkube \ Mar 08 05:27:34 crc kubenswrapper[4717]: --enable-interconnect \ Mar 08 05:27:34 crc kubenswrapper[4717]: --init-cluster-manager "${K8S_NODE}" \ Mar 08 05:27:34 crc kubenswrapper[4717]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 08 05:27:34 crc kubenswrapper[4717]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 08 05:27:34 crc kubenswrapper[4717]: --metrics-bind-address "127.0.0.1:29108" \ Mar 08 05:27:34 crc kubenswrapper[4717]: --metrics-enable-pprof \ Mar 08 05:27:34 crc kubenswrapper[4717]: --metrics-enable-config-duration \ Mar 08 05:27:34 crc kubenswrapper[4717]: ${ovn_v4_join_subnet_opt} \ Mar 08 05:27:34 crc kubenswrapper[4717]: ${ovn_v6_join_subnet_opt} \ Mar 08 05:27:34 crc kubenswrapper[4717]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 08 05:27:34 crc kubenswrapper[4717]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 08 05:27:34 crc kubenswrapper[4717]: ${dns_name_resolver_enabled_flag} \ Mar 08 05:27:34 crc kubenswrapper[4717]: ${persistent_ips_enabled_flag} \ Mar 08 05:27:34 crc kubenswrapper[4717]: ${multi_network_enabled_flag} \ Mar 08 05:27:34 crc kubenswrapper[4717]: ${network_segmentation_enabled_flag} 
Mar 08 05:27:34 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvft8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-q67qb_openshift-ovn-kubernetes(a5508bbd-d773-4b40-a641-e538e619bc1b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:34 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.182594 4717 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" podUID="a5508bbd-d773-4b40-a641-e538e619bc1b" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.182752 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.193070 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.202274 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.208865 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 
08 05:27:34 crc kubenswrapper[4717]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 08 05:27:34 crc kubenswrapper[4717]: apiVersion: v1 Mar 08 05:27:34 crc kubenswrapper[4717]: clusters: Mar 08 05:27:34 crc kubenswrapper[4717]: - cluster: Mar 08 05:27:34 crc kubenswrapper[4717]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 08 05:27:34 crc kubenswrapper[4717]: server: https://api-int.crc.testing:6443 Mar 08 05:27:34 crc kubenswrapper[4717]: name: default-cluster Mar 08 05:27:34 crc kubenswrapper[4717]: contexts: Mar 08 05:27:34 crc kubenswrapper[4717]: - context: Mar 08 05:27:34 crc kubenswrapper[4717]: cluster: default-cluster Mar 08 05:27:34 crc kubenswrapper[4717]: namespace: default Mar 08 05:27:34 crc kubenswrapper[4717]: user: default-auth Mar 08 05:27:34 crc kubenswrapper[4717]: name: default-context Mar 08 05:27:34 crc kubenswrapper[4717]: current-context: default-context Mar 08 05:27:34 crc kubenswrapper[4717]: kind: Config Mar 08 05:27:34 crc kubenswrapper[4717]: preferences: {} Mar 08 05:27:34 crc kubenswrapper[4717]: users: Mar 08 05:27:34 crc kubenswrapper[4717]: - name: default-auth Mar 08 05:27:34 crc kubenswrapper[4717]: user: Mar 08 05:27:34 crc kubenswrapper[4717]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 05:27:34 crc kubenswrapper[4717]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 05:27:34 crc kubenswrapper[4717]: EOF Mar 08 05:27:34 crc kubenswrapper[4717]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6btr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:34 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.210110 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.232360 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.254940 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.257907 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.257954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.257966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.257989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.258003 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:34Z","lastTransitionTime":"2026-03-08T05:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.271103 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.286110 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.295235 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.303613 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.313192 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.356028 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.357309 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.357502 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.357549 4717 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:27:35.357498095 +0000 UTC m=+82.275146979 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.357622 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.357743 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:35.357723941 +0000 UTC m=+82.275372795 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.357634 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.357820 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.357897 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:35.357882265 +0000 UTC m=+82.275531139 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.360829 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.360868 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.360886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.360914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.360927 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:34Z","lastTransitionTime":"2026-03-08T05:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.396459 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.435119 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.458898 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.458998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.459094 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.459116 
4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.459128 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.459126 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.459589 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.459612 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.459624 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.459666 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-08 05:27:35.459648555 +0000 UTC m=+82.377297419 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.459706 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:35.459698446 +0000 UTC m=+82.377347300 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.459721 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs podName:dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:35.459714597 +0000 UTC m=+82.377363451 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs") pod "network-metrics-daemon-d64q9" (UID: "dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.459503 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.463509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.463546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.463558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.463575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.463589 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:34Z","lastTransitionTime":"2026-03-08T05:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.473547 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.512735 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.566269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.566313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.566321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.566338 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.566347 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:34Z","lastTransitionTime":"2026-03-08T05:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.668779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.668841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.668853 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.668871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.668882 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:34Z","lastTransitionTime":"2026-03-08T05:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.773054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.773106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.773174 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.773220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.773241 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:34Z","lastTransitionTime":"2026-03-08T05:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.798722 4717 scope.go:117] "RemoveContainer" containerID="541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.798963 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 05:27:34 crc kubenswrapper[4717]: E0308 05:27:34.799070 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.876400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.876462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.876480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.876507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.876527 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:34Z","lastTransitionTime":"2026-03-08T05:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.979174 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.979229 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.979246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.979270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:34 crc kubenswrapper[4717]: I0308 05:27:34.979287 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:34Z","lastTransitionTime":"2026-03-08T05:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.082172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.082214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.082227 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.082243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.082255 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:35Z","lastTransitionTime":"2026-03-08T05:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.123186 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6f7j" event={"ID":"95c5996b-1216-4f9c-bc1f-0ca06f8de088","Type":"ContainerStarted","Data":"b17ec4daf063c5703b00256072455de7f5b16699094429ac601ee68a37701dc1"} Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.124901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6j4jn" event={"ID":"9e996d1c-6f08-4f2d-a64b-e6f58300117d","Type":"ContainerStarted","Data":"7c327fa7b973249b8c50b6200e24c8c619f91b7ecad57a75313397bae29b1895"} Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.125120 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:35 crc kubenswrapper[4717]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 08 05:27:35 crc kubenswrapper[4717]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 08 05:27:35 crc kubenswrapper[4717]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bgxsz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-d6f7j_openshift-multus(95c5996b-1216-4f9c-bc1f-0ca06f8de088): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:35 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.126340 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"76add08484a3a8d64393522f61f226f44d1c19c40da5e2a47a62ab9136a28699"} Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.126369 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-d6f7j" podUID="95c5996b-1216-4f9c-bc1f-0ca06f8de088" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.126476 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:35 crc kubenswrapper[4717]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 08 05:27:35 crc kubenswrapper[4717]: while [ true ]; Mar 08 05:27:35 crc kubenswrapper[4717]: do Mar 08 05:27:35 crc kubenswrapper[4717]: for f in $(ls /tmp/serviceca); do Mar 08 05:27:35 crc kubenswrapper[4717]: echo $f Mar 08 05:27:35 crc kubenswrapper[4717]: ca_file_path="/tmp/serviceca/${f}" Mar 08 05:27:35 crc kubenswrapper[4717]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 08 05:27:35 crc kubenswrapper[4717]: reg_dir_path="/etc/docker/certs.d/${f}" 
Mar 08 05:27:35 crc kubenswrapper[4717]: if [ -e "${reg_dir_path}" ]; then Mar 08 05:27:35 crc kubenswrapper[4717]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 08 05:27:35 crc kubenswrapper[4717]: else Mar 08 05:27:35 crc kubenswrapper[4717]: mkdir $reg_dir_path Mar 08 05:27:35 crc kubenswrapper[4717]: cp $ca_file_path $reg_dir_path/ca.crt Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: done Mar 08 05:27:35 crc kubenswrapper[4717]: for d in $(ls /etc/docker/certs.d); do Mar 08 05:27:35 crc kubenswrapper[4717]: echo $d Mar 08 05:27:35 crc kubenswrapper[4717]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 08 05:27:35 crc kubenswrapper[4717]: reg_conf_path="/tmp/serviceca/${dp}" Mar 08 05:27:35 crc kubenswrapper[4717]: if [ ! -e "${reg_conf_path}" ]; then Mar 08 05:27:35 crc kubenswrapper[4717]: rm -rf /etc/docker/certs.d/$d Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: done Mar 08 05:27:35 crc kubenswrapper[4717]: sleep 60 & wait ${!} Mar 08 05:27:35 crc kubenswrapper[4717]: done Mar 08 05:27:35 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r82qg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-6j4jn_openshift-image-registry(9e996d1c-6f08-4f2d-a64b-e6f58300117d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:35 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.127569 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-6j4jn" podUID="9e996d1c-6f08-4f2d-a64b-e6f58300117d" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.127838 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:35 crc kubenswrapper[4717]: init container 
&Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 08 05:27:35 crc kubenswrapper[4717]: apiVersion: v1 Mar 08 05:27:35 crc kubenswrapper[4717]: clusters: Mar 08 05:27:35 crc kubenswrapper[4717]: - cluster: Mar 08 05:27:35 crc kubenswrapper[4717]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 08 05:27:35 crc kubenswrapper[4717]: server: https://api-int.crc.testing:6443 Mar 08 05:27:35 crc kubenswrapper[4717]: name: default-cluster Mar 08 05:27:35 crc kubenswrapper[4717]: contexts: Mar 08 05:27:35 crc kubenswrapper[4717]: - context: Mar 08 05:27:35 crc kubenswrapper[4717]: cluster: default-cluster Mar 08 05:27:35 crc kubenswrapper[4717]: namespace: default Mar 08 05:27:35 crc kubenswrapper[4717]: user: default-auth Mar 08 05:27:35 crc kubenswrapper[4717]: name: default-context Mar 08 05:27:35 crc kubenswrapper[4717]: current-context: default-context Mar 08 05:27:35 crc kubenswrapper[4717]: kind: Config Mar 08 05:27:35 crc kubenswrapper[4717]: preferences: {} Mar 08 05:27:35 crc kubenswrapper[4717]: users: Mar 08 05:27:35 crc kubenswrapper[4717]: - name: default-auth Mar 08 05:27:35 crc kubenswrapper[4717]: user: Mar 08 05:27:35 crc kubenswrapper[4717]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 05:27:35 crc kubenswrapper[4717]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 05:27:35 crc kubenswrapper[4717]: EOF Mar 08 05:27:35 crc kubenswrapper[4717]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6btr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:35 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.128183 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qhwzg" event={"ID":"5a6f4d53-3a88-4caa-b66c-3254cd82186b","Type":"ContainerStarted","Data":"7293277ceeec1869b205f563b0aa95966807bf4d98b2c52cceee0157bd566a03"} Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.128888 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.129493 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:35 crc kubenswrapper[4717]: container 
&Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 08 05:27:35 crc kubenswrapper[4717]: set -uo pipefail Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 08 05:27:35 crc kubenswrapper[4717]: HOSTS_FILE="/etc/hosts" Mar 08 05:27:35 crc kubenswrapper[4717]: TEMP_FILE="/etc/hosts.tmp" Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: # Make a temporary file with the old hosts file's attributes. Mar 08 05:27:35 crc kubenswrapper[4717]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 08 05:27:35 crc kubenswrapper[4717]: echo "Failed to preserve hosts file. Exiting." Mar 08 05:27:35 crc kubenswrapper[4717]: exit 1 Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: while true; do Mar 08 05:27:35 crc kubenswrapper[4717]: declare -A svc_ips Mar 08 05:27:35 crc kubenswrapper[4717]: for svc in "${services[@]}"; do Mar 08 05:27:35 crc kubenswrapper[4717]: # Fetch service IP from cluster dns if present. We make several tries Mar 08 05:27:35 crc kubenswrapper[4717]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 08 05:27:35 crc kubenswrapper[4717]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 08 05:27:35 crc kubenswrapper[4717]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 08 05:27:35 crc kubenswrapper[4717]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 05:27:35 crc kubenswrapper[4717]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 05:27:35 crc kubenswrapper[4717]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 05:27:35 crc kubenswrapper[4717]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 08 05:27:35 crc kubenswrapper[4717]: for i in ${!cmds[*]} Mar 08 05:27:35 crc kubenswrapper[4717]: do Mar 08 05:27:35 crc kubenswrapper[4717]: ips=($(eval "${cmds[i]}")) Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: svc_ips["${svc}"]="${ips[@]}" Mar 08 05:27:35 crc kubenswrapper[4717]: break Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: done Mar 08 05:27:35 crc kubenswrapper[4717]: done Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: # Update /etc/hosts only if we get valid service IPs Mar 08 05:27:35 crc kubenswrapper[4717]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 08 05:27:35 crc kubenswrapper[4717]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 08 05:27:35 crc kubenswrapper[4717]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 08 05:27:35 crc kubenswrapper[4717]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 08 05:27:35 crc kubenswrapper[4717]: sleep 60 & wait Mar 08 05:27:35 crc kubenswrapper[4717]: continue Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: # Append resolver entries for services Mar 08 05:27:35 crc kubenswrapper[4717]: rc=0 Mar 08 05:27:35 crc kubenswrapper[4717]: for svc in "${!svc_ips[@]}"; do Mar 08 05:27:35 crc kubenswrapper[4717]: for ip in ${svc_ips[${svc}]}; do Mar 08 05:27:35 crc kubenswrapper[4717]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 08 05:27:35 crc kubenswrapper[4717]: done Mar 08 05:27:35 crc kubenswrapper[4717]: done Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ $rc -ne 0 ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: sleep 60 & wait Mar 08 05:27:35 crc kubenswrapper[4717]: continue Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 08 05:27:35 crc kubenswrapper[4717]: # Replace /etc/hosts with our modified version if needed Mar 08 05:27:35 crc kubenswrapper[4717]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 08 05:27:35 crc kubenswrapper[4717]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: sleep 60 & wait Mar 08 05:27:35 crc kubenswrapper[4717]: unset svc_ips Mar 08 05:27:35 crc kubenswrapper[4717]: done Mar 08 05:27:35 crc kubenswrapper[4717]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzbcs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-qhwzg_openshift-dns(5a6f4d53-3a88-4caa-b66c-3254cd82186b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:35 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.129736 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" event={"ID":"a5508bbd-d773-4b40-a641-e538e619bc1b","Type":"ContainerStarted","Data":"618ed0daad9da2626773e0dc4e6956460795b7b9ed480605bb5b828316932d49"} Mar 08 05:27:35 
crc kubenswrapper[4717]: E0308 05:27:35.130770 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-qhwzg" podUID="5a6f4d53-3a88-4caa-b66c-3254cd82186b" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.131105 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"db78921f595c5a86d1b0caf62d2bb49999d322bc1c9cc313cec346d76dd2cc63"} Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.131115 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:35 crc kubenswrapper[4717]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 08 05:27:35 crc kubenswrapper[4717]: set -euo pipefail Mar 08 05:27:35 crc kubenswrapper[4717]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 08 05:27:35 crc kubenswrapper[4717]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 08 05:27:35 crc kubenswrapper[4717]: # As the secret mount is optional we must wait for the files to be present. Mar 08 05:27:35 crc kubenswrapper[4717]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 08 05:27:35 crc kubenswrapper[4717]: TS=$(date +%s) Mar 08 05:27:35 crc kubenswrapper[4717]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 08 05:27:35 crc kubenswrapper[4717]: HAS_LOGGED_INFO=0 Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: log_missing_certs(){ Mar 08 05:27:35 crc kubenswrapper[4717]: CUR_TS=$(date +%s) Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 08 05:27:35 crc kubenswrapper[4717]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 08 05:27:35 crc kubenswrapper[4717]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 08 05:27:35 crc kubenswrapper[4717]: HAS_LOGGED_INFO=1 Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: } Mar 08 05:27:35 crc kubenswrapper[4717]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 08 05:27:35 crc kubenswrapper[4717]: log_missing_certs Mar 08 05:27:35 crc kubenswrapper[4717]: sleep 5 Mar 08 05:27:35 crc kubenswrapper[4717]: done Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 08 05:27:35 crc kubenswrapper[4717]: exec /usr/bin/kube-rbac-proxy \ Mar 08 05:27:35 crc kubenswrapper[4717]: --logtostderr \ Mar 08 05:27:35 crc kubenswrapper[4717]: --secure-listen-address=:9108 \ Mar 08 05:27:35 crc kubenswrapper[4717]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 08 05:27:35 crc kubenswrapper[4717]: --upstream=http://127.0.0.1:29108/ \ Mar 08 05:27:35 crc kubenswrapper[4717]: --tls-private-key-file=${TLS_PK} \ Mar 08 05:27:35 crc kubenswrapper[4717]: --tls-cert-file=${TLS_CERT} Mar 08 05:27:35 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvft8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-q67qb_openshift-ovn-kubernetes(a5508bbd-d773-4b40-a641-e538e619bc1b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:35 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.132929 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" event={"ID":"a5c6317f-efb5-4d91-b5df-c56e975f7c1c","Type":"ContainerStarted","Data":"33663a72b6de6b7d9a7a59efe0b4e1fdd9b41bce749ca8c91df00382273b146f"} Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.133019 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwsgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.133195 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:35 crc kubenswrapper[4717]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ -f "/env/_master" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: set -o allexport Mar 08 05:27:35 crc kubenswrapper[4717]: source "/env/_master" Mar 08 05:27:35 crc kubenswrapper[4717]: set +o allexport Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: ovn_v4_join_subnet_opt= Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ "" != "" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: ovn_v6_join_subnet_opt= Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ "" != "" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: ovn_v4_transit_switch_subnet_opt= Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ "" != "" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: ovn_v6_transit_switch_subnet_opt= Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ "" != "" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: 
dns_name_resolver_enabled_flag= Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ "false" == "true" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: persistent_ips_enabled_flag= Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ "true" == "true" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: # This is needed so that converting clusters from GA to TP Mar 08 05:27:35 crc kubenswrapper[4717]: # will rollout control plane pods as well Mar 08 05:27:35 crc kubenswrapper[4717]: network_segmentation_enabled_flag= Mar 08 05:27:35 crc kubenswrapper[4717]: multi_network_enabled_flag= Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ "true" == "true" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: multi_network_enabled_flag="--enable-multi-network" Mar 08 05:27:35 crc kubenswrapper[4717]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 08 05:27:35 crc kubenswrapper[4717]: exec /usr/bin/ovnkube \ Mar 08 05:27:35 crc kubenswrapper[4717]: --enable-interconnect \ Mar 08 05:27:35 crc kubenswrapper[4717]: --init-cluster-manager "${K8S_NODE}" \ Mar 08 05:27:35 crc kubenswrapper[4717]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 08 05:27:35 crc kubenswrapper[4717]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 08 05:27:35 crc kubenswrapper[4717]: --metrics-bind-address "127.0.0.1:29108" \ Mar 08 05:27:35 crc kubenswrapper[4717]: --metrics-enable-pprof \ Mar 08 05:27:35 
crc kubenswrapper[4717]: --metrics-enable-config-duration \ Mar 08 05:27:35 crc kubenswrapper[4717]: ${ovn_v4_join_subnet_opt} \ Mar 08 05:27:35 crc kubenswrapper[4717]: ${ovn_v6_join_subnet_opt} \ Mar 08 05:27:35 crc kubenswrapper[4717]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 08 05:27:35 crc kubenswrapper[4717]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 08 05:27:35 crc kubenswrapper[4717]: ${dns_name_resolver_enabled_flag} \ Mar 08 05:27:35 crc kubenswrapper[4717]: ${persistent_ips_enabled_flag} \ Mar 08 05:27:35 crc kubenswrapper[4717]: ${multi_network_enabled_flag} \ Mar 08 05:27:35 crc kubenswrapper[4717]: ${network_segmentation_enabled_flag} Mar 08 05:27:35 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvft8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-q67qb_openshift-ovn-kubernetes(a5508bbd-d773-4b40-a641-e538e619bc1b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:35 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.133912 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"069a044677c14b9d84b60b7b352d68dc390219807c99a6fee0c6aa26b2832032"} Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.134433 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ddlf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-pkcrh_openshift-multus(a5c6317f-efb5-4d91-b5df-c56e975f7c1c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.134565 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" podUID="a5508bbd-d773-4b40-a641-e538e619bc1b" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.135466 4717 scope.go:117] "RemoveContainer" containerID="541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.135556 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" podUID="a5c6317f-efb5-4d91-b5df-c56e975f7c1c" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.135868 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.136192 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:35 crc kubenswrapper[4717]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ -f "/env/_master" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: set -o allexport Mar 08 05:27:35 crc 
kubenswrapper[4717]: source "/env/_master" Mar 08 05:27:35 crc kubenswrapper[4717]: set +o allexport Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 08 05:27:35 crc kubenswrapper[4717]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 08 05:27:35 crc kubenswrapper[4717]: ho_enable="--enable-hybrid-overlay" Mar 08 05:27:35 crc kubenswrapper[4717]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 08 05:27:35 crc kubenswrapper[4717]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 08 05:27:35 crc kubenswrapper[4717]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 08 05:27:35 crc kubenswrapper[4717]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 05:27:35 crc kubenswrapper[4717]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 08 05:27:35 crc kubenswrapper[4717]: --webhook-host=127.0.0.1 \ Mar 08 05:27:35 crc kubenswrapper[4717]: --webhook-port=9743 \ Mar 08 05:27:35 crc kubenswrapper[4717]: ${ho_enable} \ Mar 08 05:27:35 crc kubenswrapper[4717]: --enable-interconnect \ Mar 08 05:27:35 crc kubenswrapper[4717]: --disable-approver \ Mar 08 05:27:35 crc kubenswrapper[4717]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 08 05:27:35 crc kubenswrapper[4717]: --wait-for-kubernetes-api=200s \ Mar 08 05:27:35 crc kubenswrapper[4717]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 08 05:27:35 crc kubenswrapper[4717]: --loglevel="${LOGLEVEL}" Mar 08 05:27:35 crc kubenswrapper[4717]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:35 crc 
kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.136159 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwsgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.137803 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.139610 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:27:35 crc kubenswrapper[4717]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 05:27:35 crc kubenswrapper[4717]: if [[ -f "/env/_master" ]]; then Mar 08 05:27:35 crc kubenswrapper[4717]: set -o allexport Mar 08 05:27:35 crc kubenswrapper[4717]: source "/env/_master" Mar 08 05:27:35 crc kubenswrapper[4717]: set +o allexport Mar 08 05:27:35 crc kubenswrapper[4717]: fi Mar 08 05:27:35 crc kubenswrapper[4717]: Mar 08 05:27:35 crc kubenswrapper[4717]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 08 05:27:35 crc kubenswrapper[4717]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 05:27:35 crc kubenswrapper[4717]: --disable-webhook \ Mar 08 05:27:35 crc kubenswrapper[4717]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 08 05:27:35 crc kubenswrapper[4717]: --loglevel="${LOGLEVEL}" Mar 08 05:27:35 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 05:27:35 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.140898 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.143215 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.154602 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.184711 4717 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.184779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.184796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.184822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.184843 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:35Z","lastTransitionTime":"2026-03-08T05:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.189549 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.211480 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.222751 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.239967 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.251858 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.268132 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.286227 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.288599 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.288670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.288718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.288744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.288765 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:35Z","lastTransitionTime":"2026-03-08T05:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.298293 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.312775 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.324415 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.341547 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.354847 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.366637 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.371672 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.371802 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.371823 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.371957 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.372001 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:27:37.371963989 +0000 UTC m=+84.289612873 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.372049 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:37.372034261 +0000 UTC m=+84.289683145 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.372256 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.372336 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:37.372315148 +0000 UTC m=+84.289964032 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.383731 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.391495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.391550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.391569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.391596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.391616 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:35Z","lastTransitionTime":"2026-03-08T05:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.399647 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.413508 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.426441 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.439719 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kub
e-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.450398 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.460152 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.469044 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.473236 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" 
(UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.473300 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.473343 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.473436 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.473461 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.473513 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.473537 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.473468 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.473626 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.473648 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.473492 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs podName:dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:37.473477803 +0000 UTC m=+84.391126647 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs") pod "network-metrics-daemon-d64q9" (UID: "dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.473751 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:37.47372095 +0000 UTC m=+84.391369814 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.473772 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:37.473762621 +0000 UTC m=+84.391411485 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.494166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.494207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.494218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.494235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.494247 4717 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:35Z","lastTransitionTime":"2026-03-08T05:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.495483 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.533470 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.577129 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.597400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.597457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.597494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.597542 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.597617 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:35Z","lastTransitionTime":"2026-03-08T05:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.621634 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.658556 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.695668 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.700599 4717 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.700649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.700662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.700698 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.700714 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:35Z","lastTransitionTime":"2026-03-08T05:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.750280 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.781501 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.781569 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.781598 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.781522 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.781660 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.781872 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.782007 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:35 crc kubenswrapper[4717]: E0308 05:27:35.782131 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.791376 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.792586 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.795157 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.796550 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.798844 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.800055 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.801398 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.802553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.802619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.802638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.802670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.802716 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:35Z","lastTransitionTime":"2026-03-08T05:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.803296 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.804776 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.806640 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.807888 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.810133 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.811214 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.812299 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.814154 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.814934 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.816718 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.817253 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.818044 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.819453 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.820090 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.821425 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.822034 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.823388 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.823951 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.824811 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.826194 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.826822 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.828087 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.828740 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.829894 4717 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.830027 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.832314 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.833555 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.834203 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.836135 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.837027 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.838217 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.839146 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.840505 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.841446 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.842835 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.843638 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.845031 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.845650 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.847003 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.847656 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.849207 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.849835 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.851286 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.852426 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.854396 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.856028 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.858001 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.907901 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.908002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.908015 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.908035 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:35 crc kubenswrapper[4717]: I0308 05:27:35.908048 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:35Z","lastTransitionTime":"2026-03-08T05:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.010983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.011031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.011040 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.011061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.011075 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:36Z","lastTransitionTime":"2026-03-08T05:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.114597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.114660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.114677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.114730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.114753 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:36Z","lastTransitionTime":"2026-03-08T05:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.217665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.217766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.217793 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.217825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.217846 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:36Z","lastTransitionTime":"2026-03-08T05:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.321475 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.321540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.321559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.321583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.321601 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:36Z","lastTransitionTime":"2026-03-08T05:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.425125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.425196 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.425215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.425245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.425267 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:36Z","lastTransitionTime":"2026-03-08T05:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.528979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.529046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.529066 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.529093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.529112 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:36Z","lastTransitionTime":"2026-03-08T05:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.632547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.632625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.632638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.632656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.632668 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:36Z","lastTransitionTime":"2026-03-08T05:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.736482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.736529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.736543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.736568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.736584 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:36Z","lastTransitionTime":"2026-03-08T05:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.839291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.839340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.839351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.839370 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.839381 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:36Z","lastTransitionTime":"2026-03-08T05:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.941858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.941911 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.941923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.941940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:36 crc kubenswrapper[4717]: I0308 05:27:36.941953 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:36Z","lastTransitionTime":"2026-03-08T05:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.045405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.045456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.045468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.045491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.045504 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:37Z","lastTransitionTime":"2026-03-08T05:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.076156 4717 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.148232 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.148316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.148354 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.148395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.148420 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:37Z","lastTransitionTime":"2026-03-08T05:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.252194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.252258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.252276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.252302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.252322 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:37Z","lastTransitionTime":"2026-03-08T05:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.356168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.356231 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.356253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.356279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.356302 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:37Z","lastTransitionTime":"2026-03-08T05:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.395347 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.395603 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.395720 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:27:41.395659024 +0000 UTC m=+88.313307908 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.395792 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.395828 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.395985 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:41.395951171 +0000 UTC m=+88.313600045 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.396167 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.396332 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:41.39629515 +0000 UTC m=+88.313944024 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.459804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.460276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.460405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.460605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 
05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.460764 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:37Z","lastTransitionTime":"2026-03-08T05:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.496763 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.497073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.497209 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.497282 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.497301 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 
05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.497355 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.497391 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs podName:dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:41.497357192 +0000 UTC m=+88.415006066 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs") pod "network-metrics-daemon-d64q9" (UID: "dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.497488 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:41.497444315 +0000 UTC m=+88.415093389 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.497811 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.497852 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.497875 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.497970 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:41.497940187 +0000 UTC m=+88.415589061 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.498212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.564094 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.564164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.564183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.564213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.564232 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:37Z","lastTransitionTime":"2026-03-08T05:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.667033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.667132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.667188 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.667217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.667269 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:37Z","lastTransitionTime":"2026-03-08T05:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.771364 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.771436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.771450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.771485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.771510 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:37Z","lastTransitionTime":"2026-03-08T05:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.780968 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.781029 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.780978 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.780968 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.781155 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.781462 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.781605 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:37 crc kubenswrapper[4717]: E0308 05:27:37.781864 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.875652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.875746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.875766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.875794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.875818 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:37Z","lastTransitionTime":"2026-03-08T05:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.978620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.978730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.978753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.978779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:37 crc kubenswrapper[4717]: I0308 05:27:37.978842 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:37Z","lastTransitionTime":"2026-03-08T05:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.083013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.083099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.083129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.083165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.083188 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:38Z","lastTransitionTime":"2026-03-08T05:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.186013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.186082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.186101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.186128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.186147 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:38Z","lastTransitionTime":"2026-03-08T05:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.289838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.289877 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.289887 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.289901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.289911 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:38Z","lastTransitionTime":"2026-03-08T05:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.393881 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.393966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.393977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.393997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.394010 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:38Z","lastTransitionTime":"2026-03-08T05:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.496865 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.496935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.496955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.496984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.497003 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:38Z","lastTransitionTime":"2026-03-08T05:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.599673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.599764 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.599784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.599811 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.599835 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:38Z","lastTransitionTime":"2026-03-08T05:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.702947 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.703027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.703063 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.703095 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.703121 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:38Z","lastTransitionTime":"2026-03-08T05:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.806523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.806595 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.806614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.806645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.806664 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:38Z","lastTransitionTime":"2026-03-08T05:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.909574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.909640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.909659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.909711 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:38 crc kubenswrapper[4717]: I0308 05:27:38.909730 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:38Z","lastTransitionTime":"2026-03-08T05:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.013448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.013524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.013549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.013584 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.013607 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:39Z","lastTransitionTime":"2026-03-08T05:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.116653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.116749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.116769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.116799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.116816 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:39Z","lastTransitionTime":"2026-03-08T05:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.220390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.220786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.220953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.221125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.221365 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:39Z","lastTransitionTime":"2026-03-08T05:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.323931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.324007 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.324018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.324068 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.324081 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:39Z","lastTransitionTime":"2026-03-08T05:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.349887 4717 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.441844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.441880 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.441889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.441908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.441918 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:39Z","lastTransitionTime":"2026-03-08T05:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.544461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.544508 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.544525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.544551 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.544570 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:39Z","lastTransitionTime":"2026-03-08T05:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.647244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.647302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.647321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.647351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.647370 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:39Z","lastTransitionTime":"2026-03-08T05:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.751573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.751632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.751648 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.751671 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.751715 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:39Z","lastTransitionTime":"2026-03-08T05:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.781641 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.781769 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.781645 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.781932 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:39 crc kubenswrapper[4717]: E0308 05:27:39.782032 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:39 crc kubenswrapper[4717]: E0308 05:27:39.782231 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:39 crc kubenswrapper[4717]: E0308 05:27:39.782433 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:39 crc kubenswrapper[4717]: E0308 05:27:39.782676 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.854634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.854730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.854753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.854777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.854794 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:39Z","lastTransitionTime":"2026-03-08T05:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.957432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.957502 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.957527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.957558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:39 crc kubenswrapper[4717]: I0308 05:27:39.957575 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:39Z","lastTransitionTime":"2026-03-08T05:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.061013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.061326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.061413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.061517 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.061610 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:40Z","lastTransitionTime":"2026-03-08T05:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.164089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.164409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.164480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.164544 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.164603 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:40Z","lastTransitionTime":"2026-03-08T05:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.268589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.268639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.268656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.268700 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.268715 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:40Z","lastTransitionTime":"2026-03-08T05:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.371549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.371593 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.371602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.371616 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.371626 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:40Z","lastTransitionTime":"2026-03-08T05:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.474436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.474490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.474502 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.474521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.474533 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:40Z","lastTransitionTime":"2026-03-08T05:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.577139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.577185 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.577196 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.577213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.577225 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:40Z","lastTransitionTime":"2026-03-08T05:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.679675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.679763 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.679781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.679805 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.679820 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:40Z","lastTransitionTime":"2026-03-08T05:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.782975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.783266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.783340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.783407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.783468 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:40Z","lastTransitionTime":"2026-03-08T05:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.853244 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.886595 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.886650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.886668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.886722 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.886741 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:40Z","lastTransitionTime":"2026-03-08T05:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.990137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.990202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.990221 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.990253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:40 crc kubenswrapper[4717]: I0308 05:27:40.990278 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:40Z","lastTransitionTime":"2026-03-08T05:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.092255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.092290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.092300 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.092317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.092328 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:41Z","lastTransitionTime":"2026-03-08T05:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.194973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.195350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.195499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.195634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.195809 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:41Z","lastTransitionTime":"2026-03-08T05:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.298594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.298710 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.298732 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.298759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.298777 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:41Z","lastTransitionTime":"2026-03-08T05:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.402067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.402129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.402146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.402172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.402192 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:41Z","lastTransitionTime":"2026-03-08T05:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.463439 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.463674 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 05:27:49.463636716 +0000 UTC m=+96.381285600 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.463904 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.463959 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.464124 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.464133 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.464218 4717 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:49.464202981 +0000 UTC m=+96.381851865 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.464249 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:49.464235841 +0000 UTC m=+96.381884715 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.505064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.505129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.505144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.505167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.505182 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:41Z","lastTransitionTime":"2026-03-08T05:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.565515 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.565602 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.565722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.565827 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.565869 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.565885 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.565904 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.565923 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.565937 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.565961 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.565974 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:49.565946201 +0000 UTC m=+96.483595255 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.566022 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs podName:dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:49.565986612 +0000 UTC m=+96.483635496 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs") pod "network-metrics-daemon-d64q9" (UID: "dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.566066 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 05:27:49.566044013 +0000 UTC m=+96.483692887 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.608214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.608262 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.608275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.608302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.608318 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:41Z","lastTransitionTime":"2026-03-08T05:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.711422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.711543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.711565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.711600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.711622 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:41Z","lastTransitionTime":"2026-03-08T05:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.780909 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.780964 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.780946 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.780913 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.781211 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.781373 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.781509 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:41 crc kubenswrapper[4717]: E0308 05:27:41.781626 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.815253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.815432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.815458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.815488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.815536 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:41Z","lastTransitionTime":"2026-03-08T05:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.918857 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.918925 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.918943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.918973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:41 crc kubenswrapper[4717]: I0308 05:27:41.918992 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:41Z","lastTransitionTime":"2026-03-08T05:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.022714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.022789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.022809 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.022836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.022856 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:42Z","lastTransitionTime":"2026-03-08T05:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.126476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.126531 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.126548 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.126575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.126597 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:42Z","lastTransitionTime":"2026-03-08T05:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.229752 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.229798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.229814 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.229836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.229856 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:42Z","lastTransitionTime":"2026-03-08T05:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.332725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.332771 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.332789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.332809 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.332826 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:42Z","lastTransitionTime":"2026-03-08T05:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.435811 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.435858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.435867 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.435884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.435894 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:42Z","lastTransitionTime":"2026-03-08T05:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.538552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.538600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.538620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.538641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.538658 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:42Z","lastTransitionTime":"2026-03-08T05:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.642060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.642167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.642188 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.642219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.642236 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:42Z","lastTransitionTime":"2026-03-08T05:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.745566 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.745642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.745662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.745746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.745784 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:42Z","lastTransitionTime":"2026-03-08T05:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.848348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.848383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.848395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.848409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.848420 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:42Z","lastTransitionTime":"2026-03-08T05:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.951210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.951714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.951872 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.952060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:42 crc kubenswrapper[4717]: I0308 05:27:42.952195 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:42Z","lastTransitionTime":"2026-03-08T05:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.054824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.055556 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.055772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.055926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.056101 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.160573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.160641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.160660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.160718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.160740 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.263348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.263419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.263436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.263464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.263485 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.366123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.366161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.366172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.366189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.366200 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.469172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.469251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.469272 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.469313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.469337 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.572562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.572653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.572672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.572736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.572758 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.675604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.675918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.676020 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.676128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.676226 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.713500 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.713712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.713808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.713912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.713999 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: E0308 05:27:43.732791 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.739077 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.739215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.739317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.739436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.739519 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: E0308 05:27:43.755126 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.760596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.760665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.760715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.760754 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.760782 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: E0308 05:27:43.775925 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.780937 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.780942 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.781084 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:43 crc kubenswrapper[4717]: E0308 05:27:43.781305 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.781506 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:43 crc kubenswrapper[4717]: E0308 05:27:43.781798 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.781976 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.782020 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.782041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: E0308 05:27:43.782017 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.782066 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.782130 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: E0308 05:27:43.782152 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:43 crc kubenswrapper[4717]: E0308 05:27:43.792251 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.796075 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.799670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.799724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc 
kubenswrapper[4717]: I0308 05:27:43.799741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.799768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.799783 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: E0308 05:27:43.814423 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: E0308 05:27:43.814730 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.814889 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.817520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.817583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.817604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.817634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.817656 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.826759 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.845089 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.862629 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.879041 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.895150 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.905871 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.921369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 
05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.921420 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.921431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.921452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.921465 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:43Z","lastTransitionTime":"2026-03-08T05:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.926101 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.941024 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.953198 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.964054 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.975498 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:43 crc kubenswrapper[4717]: I0308 05:27:43.989172 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.000218 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.009409 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.024935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.025001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.025022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.025049 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.025071 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:44Z","lastTransitionTime":"2026-03-08T05:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.129149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.129215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.129236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.129267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.129286 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:44Z","lastTransitionTime":"2026-03-08T05:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.232734 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.232810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.232830 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.232858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.232878 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:44Z","lastTransitionTime":"2026-03-08T05:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.335547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.335615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.335633 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.335663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.335714 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:44Z","lastTransitionTime":"2026-03-08T05:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.439179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.439262 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.439288 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.439320 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.439337 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:44Z","lastTransitionTime":"2026-03-08T05:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.542916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.542995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.543018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.543045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.543067 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:44Z","lastTransitionTime":"2026-03-08T05:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.646600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.646670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.646712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.646742 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.646772 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:44Z","lastTransitionTime":"2026-03-08T05:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.750797 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.750843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.750857 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.750880 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.750895 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:44Z","lastTransitionTime":"2026-03-08T05:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.854220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.854271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.854283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.854307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.854319 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:44Z","lastTransitionTime":"2026-03-08T05:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.956535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.956579 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.956603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.956622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:44 crc kubenswrapper[4717]: I0308 05:27:44.956631 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:44Z","lastTransitionTime":"2026-03-08T05:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.058997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.059032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.059043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.059060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.059074 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:45Z","lastTransitionTime":"2026-03-08T05:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.161929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.161969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.161979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.162031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.162043 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:45Z","lastTransitionTime":"2026-03-08T05:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.265170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.265229 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.265247 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.265271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.265290 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:45Z","lastTransitionTime":"2026-03-08T05:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.368352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.368413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.368468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.368496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.368554 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:45Z","lastTransitionTime":"2026-03-08T05:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.471528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.471581 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.471600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.471625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.471643 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:45Z","lastTransitionTime":"2026-03-08T05:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.573924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.573964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.573974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.573988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.573998 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:45Z","lastTransitionTime":"2026-03-08T05:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.676709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.676764 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.676777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.676799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.676815 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:45Z","lastTransitionTime":"2026-03-08T05:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.779037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.779110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.779133 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.779162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.779184 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:45Z","lastTransitionTime":"2026-03-08T05:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.780623 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.780799 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.780841 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:45 crc kubenswrapper[4717]: E0308 05:27:45.780940 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.780977 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:45 crc kubenswrapper[4717]: E0308 05:27:45.782280 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:45 crc kubenswrapper[4717]: E0308 05:27:45.782387 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:45 crc kubenswrapper[4717]: E0308 05:27:45.782501 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.882088 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.882127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.882139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.882155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.882167 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:45Z","lastTransitionTime":"2026-03-08T05:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.985365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.985404 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.985412 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.985428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:45 crc kubenswrapper[4717]: I0308 05:27:45.985437 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:45Z","lastTransitionTime":"2026-03-08T05:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.088509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.089089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.089379 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.089406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.089607 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:46Z","lastTransitionTime":"2026-03-08T05:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.170768 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6j4jn" event={"ID":"9e996d1c-6f08-4f2d-a64b-e6f58300117d","Type":"ContainerStarted","Data":"2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.172491 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b" exitCode=0 Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.172536 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.186276 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.193559 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.194419 4717 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.194457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.194467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.194484 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.194496 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:46Z","lastTransitionTime":"2026-03-08T05:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.210918 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.223466 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.232139 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.244532 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.252284 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.262543 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.286301 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.296927 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.300266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.300306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.300315 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.300329 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.300338 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:46Z","lastTransitionTime":"2026-03-08T05:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.306912 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.321348 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.334003 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.343820 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.349715 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.356963 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.362575 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.370342 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.380920 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.389564 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.399923 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.403245 4717 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.403269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.403278 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.403292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.403301 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:46Z","lastTransitionTime":"2026-03-08T05:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.416043 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.425583 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.432238 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.442846 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.451494 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.467829 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.477328 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.485592 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.492253 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.501275 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.504917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.504960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.504971 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.504987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.504998 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:46Z","lastTransitionTime":"2026-03-08T05:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.513294 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.607431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.607652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.607764 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.607851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.607931 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:46Z","lastTransitionTime":"2026-03-08T05:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.710465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.710513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.710530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.710552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.710568 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:46Z","lastTransitionTime":"2026-03-08T05:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.812899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.812945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.812962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.812985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.813001 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:46Z","lastTransitionTime":"2026-03-08T05:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.919172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.919213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.919226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.919249 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:46 crc kubenswrapper[4717]: I0308 05:27:46.919261 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:46Z","lastTransitionTime":"2026-03-08T05:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.022396 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.022446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.022461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.022480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.022493 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:47Z","lastTransitionTime":"2026-03-08T05:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.125160 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.125194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.125202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.125216 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.125225 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:47Z","lastTransitionTime":"2026-03-08T05:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.182345 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.182378 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.182388 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.182396 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.182408 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.182418 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e"} Mar 08 05:27:47 crc kubenswrapper[4717]: 
I0308 05:27:47.183642 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qhwzg" event={"ID":"5a6f4d53-3a88-4caa-b66c-3254cd82186b","Type":"ContainerStarted","Data":"8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.184760 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.192177 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.200172 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.211657 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.220089 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.227312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.227340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.227349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.227363 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.227371 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:47Z","lastTransitionTime":"2026-03-08T05:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.228497 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.243481 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.253835 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.263765 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.274045 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.282598 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.296291 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.319267 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2
ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.327869 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.329709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.329737 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.329749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.329765 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.329777 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:47Z","lastTransitionTime":"2026-03-08T05:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.335466 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.343328 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.350657 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.356573 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.362906 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.370781 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.384559 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.393823 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.408073 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.420356 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.427880 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.431557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.431610 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.431628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.431651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.431668 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:47Z","lastTransitionTime":"2026-03-08T05:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.444861 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.456655 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.471595 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.486884 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.512715 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.524185 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.533673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.533711 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.533719 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.533733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.533744 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:47Z","lastTransitionTime":"2026-03-08T05:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.539091 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.557268 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.636362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.636414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 
05:27:47.636433 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.636461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.636474 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:47Z","lastTransitionTime":"2026-03-08T05:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.739412 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.739465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.739487 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.739517 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.739540 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:47Z","lastTransitionTime":"2026-03-08T05:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.781785 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.781957 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:47 crc kubenswrapper[4717]: E0308 05:27:47.782031 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.782182 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:47 crc kubenswrapper[4717]: E0308 05:27:47.782223 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.782301 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:47 crc kubenswrapper[4717]: E0308 05:27:47.782830 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:47 crc kubenswrapper[4717]: E0308 05:27:47.782945 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.842454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.842498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.842509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.842525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.842535 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:47Z","lastTransitionTime":"2026-03-08T05:27:47Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.947964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.948031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.948055 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.948080 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:47 crc kubenswrapper[4717]: I0308 05:27:47.948097 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:47Z","lastTransitionTime":"2026-03-08T05:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.055028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.055382 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.055392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.055407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.055420 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:48Z","lastTransitionTime":"2026-03-08T05:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.157526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.157561 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.157572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.157586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.157596 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:48Z","lastTransitionTime":"2026-03-08T05:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.189730 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.189818 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.191717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.191775 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.200665 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.208266 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.218258 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.239437 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.260018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.260059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.260067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.260081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.260092 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:48Z","lastTransitionTime":"2026-03-08T05:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.261902 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.289871 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.321732 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.349327 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.361636 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.363105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.363143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.363153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.363171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.363185 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:48Z","lastTransitionTime":"2026-03-08T05:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.374714 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.385314 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.394962 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.405493 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.414257 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.425361 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.441026 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2
ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.448838 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.456754 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.466477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.466518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.466528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.466544 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.466553 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:48Z","lastTransitionTime":"2026-03-08T05:27:48Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.471296 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.480951 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.495190 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.511169 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.527300 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:48Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.537226 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:48Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.562716 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:48Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.568738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.568778 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.568789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:48 crc 
kubenswrapper[4717]: I0308 05:27:48.568808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.568822 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:48Z","lastTransitionTime":"2026-03-08T05:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.580547 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:48Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.596895 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:48Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.612592 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:48Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.631424 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:48Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.656645 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/ma
nifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846eb
aabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:48Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.670706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.670751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.670762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.670777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.670787 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:48Z","lastTransitionTime":"2026-03-08T05:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.680780 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:48Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.696385 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:48Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.772667 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.772756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.772776 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.772802 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.772820 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:48Z","lastTransitionTime":"2026-03-08T05:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.876293 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.876348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.876359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.876380 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.876392 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:48Z","lastTransitionTime":"2026-03-08T05:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.981613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.981670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.981706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.981734 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:48 crc kubenswrapper[4717]: I0308 05:27:48.981755 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:48Z","lastTransitionTime":"2026-03-08T05:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.084835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.085212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.085222 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.085236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.085245 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:49Z","lastTransitionTime":"2026-03-08T05:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.187426 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.187468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.187482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.187501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.187514 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:49Z","lastTransitionTime":"2026-03-08T05:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.290795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.290837 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.290846 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.290862 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.290872 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:49Z","lastTransitionTime":"2026-03-08T05:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.396056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.396092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.396102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.396115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.396124 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:49Z","lastTransitionTime":"2026-03-08T05:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.464280 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.464470 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 05:28:05.464440115 +0000 UTC m=+112.382088969 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.464575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.464640 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.464734 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.464792 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-08 05:28:05.464781734 +0000 UTC m=+112.382430588 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.464909 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.465028 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:28:05.464999179 +0000 UTC m=+112.382648083 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.499428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.499495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.499514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.499538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.499555 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:49Z","lastTransitionTime":"2026-03-08T05:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.602369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.602413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.602425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.602442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.602457 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:49Z","lastTransitionTime":"2026-03-08T05:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.666196 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.666259 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.666298 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.666421 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.666468 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs podName:dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7 nodeName:}" failed. No retries permitted until 2026-03-08 05:28:05.666451928 +0000 UTC m=+112.584100772 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs") pod "network-metrics-daemon-d64q9" (UID: "dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.666546 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.666564 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.666577 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.666607 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 05:28:05.666597521 +0000 UTC m=+112.584246365 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.666668 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.666697 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.666707 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.666741 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 05:28:05.666725665 +0000 UTC m=+112.584374509 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.716029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.716084 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.716102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.716123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.716141 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:49Z","lastTransitionTime":"2026-03-08T05:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.781340 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.781760 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.781898 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.782139 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.782239 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.782486 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.782566 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.782771 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.783386 4717 scope.go:117] "RemoveContainer" containerID="541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965" Mar 08 05:27:49 crc kubenswrapper[4717]: E0308 05:27:49.783743 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.814647 4717 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.818791 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.818826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:49 crc 
kubenswrapper[4717]: I0308 05:27:49.818841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.818860 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.818872 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:49Z","lastTransitionTime":"2026-03-08T05:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.921107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.921177 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.921194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.921219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:49 crc kubenswrapper[4717]: I0308 05:27:49.921236 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:49Z","lastTransitionTime":"2026-03-08T05:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.024513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.024554 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.024565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.024581 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.024593 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:50Z","lastTransitionTime":"2026-03-08T05:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.127591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.128102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.128120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.128145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.128163 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:50Z","lastTransitionTime":"2026-03-08T05:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.198385 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" event={"ID":"a5508bbd-d773-4b40-a641-e538e619bc1b","Type":"ContainerStarted","Data":"44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.198439 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" event={"ID":"a5508bbd-d773-4b40-a641-e538e619bc1b","Type":"ContainerStarted","Data":"cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.200388 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6f7j" event={"ID":"95c5996b-1216-4f9c-bc1f-0ca06f8de088","Type":"ContainerStarted","Data":"ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.203979 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.205964 4717 generic.go:334] "Generic (PLEG): container finished" podID="a5c6317f-efb5-4d91-b5df-c56e975f7c1c" containerID="7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467" exitCode=0 Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.206031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" event={"ID":"a5c6317f-efb5-4d91-b5df-c56e975f7c1c","Type":"ContainerDied","Data":"7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.218094 4717 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.231673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.231817 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.231841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.231866 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.231886 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:50Z","lastTransitionTime":"2026-03-08T05:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.236469 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.257547 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.275265 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.288369 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc 
kubenswrapper[4717]: I0308 05:27:50.306485 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.322193 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.334749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.334778 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.334787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:50 crc 
kubenswrapper[4717]: I0308 05:27:50.334800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.334810 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:50Z","lastTransitionTime":"2026-03-08T05:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.335389 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.353441 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.365289 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee4
2f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.390172 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.406272 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.423084 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.434423 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.437000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.437034 4717 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.437045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.437062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.437074 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:50Z","lastTransitionTime":"2026-03-08T05:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.448106 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.463141 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.476101 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.488056 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.500424 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.517160 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.528710 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.539866 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.539954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.539966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.539980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.539988 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:50Z","lastTransitionTime":"2026-03-08T05:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.541005 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.550946 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.562305 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.574118 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.586417 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.597050 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc 
kubenswrapper[4717]: I0308 05:27:50.614576 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.627792 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.637851 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.642084 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.642128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.642140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:50 crc 
kubenswrapper[4717]: I0308 05:27:50.642161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.642174 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:50Z","lastTransitionTime":"2026-03-08T05:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.653150 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.665372 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee4
2f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:50Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.744736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.744772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.744779 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.744794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.744804 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:50Z","lastTransitionTime":"2026-03-08T05:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.848167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.848204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.848399 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.848414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.848431 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:50Z","lastTransitionTime":"2026-03-08T05:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.950321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.950353 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.950362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.950375 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:50 crc kubenswrapper[4717]: I0308 05:27:50.950384 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:50Z","lastTransitionTime":"2026-03-08T05:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.053093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.053169 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.053186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.053234 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.053247 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:51Z","lastTransitionTime":"2026-03-08T05:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.156601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.156649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.156658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.156674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.156707 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:51Z","lastTransitionTime":"2026-03-08T05:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.213032 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.216709 4717 generic.go:334] "Generic (PLEG): container finished" podID="a5c6317f-efb5-4d91-b5df-c56e975f7c1c" containerID="2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8" exitCode=0 Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.216774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" event={"ID":"a5c6317f-efb5-4d91-b5df-c56e975f7c1c","Type":"ContainerDied","Data":"2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.230539 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.262053 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.270926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.270975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.270990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.271011 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.271026 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:51Z","lastTransitionTime":"2026-03-08T05:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.280030 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.296012 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.332254 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.351313 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.361910 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.372494 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.373402 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.373430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.373442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.373467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.373492 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:51Z","lastTransitionTime":"2026-03-08T05:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.391109 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.407343 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.425133 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.440172 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc 
kubenswrapper[4717]: I0308 05:27:51.453353 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.466891 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.478829 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.478869 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.478884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.478903 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.478914 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:51Z","lastTransitionTime":"2026-03-08T05:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.480078 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.499147 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.510825 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc 
kubenswrapper[4717]: I0308 05:27:51.527793 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.540170 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.552793 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.570009 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.581207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.581285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.581272 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.581310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.581460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.581510 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:51Z","lastTransitionTime":"2026-03-08T05:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.593655 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.608848 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.631378 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.646567 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.657694 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.670569 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.683835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.683911 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.683929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.683985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.684006 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:51Z","lastTransitionTime":"2026-03-08T05:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.686034 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:
27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.708967 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.720416 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.731974 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:51Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.783566 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:51 crc kubenswrapper[4717]: E0308 05:27:51.783701 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.784178 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.784261 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:51 crc kubenswrapper[4717]: E0308 05:27:51.784301 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:51 crc kubenswrapper[4717]: E0308 05:27:51.784634 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.784816 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:51 crc kubenswrapper[4717]: E0308 05:27:51.784975 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.810636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.810674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.810710 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.810730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.810743 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:51Z","lastTransitionTime":"2026-03-08T05:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.913258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.913597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.913718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.913822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:51 crc kubenswrapper[4717]: I0308 05:27:51.913838 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:51Z","lastTransitionTime":"2026-03-08T05:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.015941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.015967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.015975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.015988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.015997 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:52Z","lastTransitionTime":"2026-03-08T05:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.118766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.118796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.118804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.118817 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.118826 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:52Z","lastTransitionTime":"2026-03-08T05:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.221601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.221644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.221661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.221710 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.221729 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:52Z","lastTransitionTime":"2026-03-08T05:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.225346 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.225714 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.225741 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.225754 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.242168 4717 generic.go:334] "Generic (PLEG): container finished" podID="a5c6317f-efb5-4d91-b5df-c56e975f7c1c" containerID="73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0" exitCode=0 Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.242229 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" event={"ID":"a5c6317f-efb5-4d91-b5df-c56e975f7c1c","Type":"ContainerDied","Data":"73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.246313 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.285722 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.298041 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.314408 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.322706 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.324248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.324280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.324292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.324308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.324319 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:52Z","lastTransitionTime":"2026-03-08T05:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.341193 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.355759 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.373134 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.384329 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.395078 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.405703 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc 
kubenswrapper[4717]: I0308 05:27:52.422026 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.427211 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.427258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.427268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.427282 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.427291 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:52Z","lastTransitionTime":"2026-03-08T05:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.445190 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.461545 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.476121 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.487456 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee4
2f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.500069 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.510830 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.523802 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.529617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.529654 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.529666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.529697 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.529707 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:52Z","lastTransitionTime":"2026-03-08T05:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.535408 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.547093 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.558872 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.580856 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.593896 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.602773 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.611438 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.629026 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.631909 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.631939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.631947 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.631961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.631970 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:52Z","lastTransitionTime":"2026-03-08T05:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.641093 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.653243 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.664648 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc 
kubenswrapper[4717]: I0308 05:27:52.675354 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.685861 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.695196 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.707024 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:52Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 
05:27:52.733958 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.734255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.734270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.734287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.734298 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:52Z","lastTransitionTime":"2026-03-08T05:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.837058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.837142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.837162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.837187 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.837204 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:52Z","lastTransitionTime":"2026-03-08T05:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.939784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.939840 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.939857 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.939881 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:52 crc kubenswrapper[4717]: I0308 05:27:52.939898 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:52Z","lastTransitionTime":"2026-03-08T05:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.043150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.043229 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.043246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.043270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.043289 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:53Z","lastTransitionTime":"2026-03-08T05:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.146885 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.146943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.146959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.146984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.147003 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:53Z","lastTransitionTime":"2026-03-08T05:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.249513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.249628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.249659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.249725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.249753 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:53Z","lastTransitionTime":"2026-03-08T05:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.251804 4717 generic.go:334] "Generic (PLEG): container finished" podID="a5c6317f-efb5-4d91-b5df-c56e975f7c1c" containerID="83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c" exitCode=0 Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.251855 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" event={"ID":"a5c6317f-efb5-4d91-b5df-c56e975f7c1c","Type":"ContainerDied","Data":"83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.277395 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.300980 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.318822 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc 
kubenswrapper[4717]: I0308 05:27:53.342542 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.353071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.353127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.353150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.353179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.353201 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:53Z","lastTransitionTime":"2026-03-08T05:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.357985 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.370147 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.388582 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.404803 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.426358 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.445398 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.459450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.459498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.459513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.459533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.459548 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:53Z","lastTransitionTime":"2026-03-08T05:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.466668 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:
27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.495878 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.512225 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d5
3ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.533278 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.553709 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.561585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.561611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.561619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.561633 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.561642 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:53Z","lastTransitionTime":"2026-03-08T05:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.568374 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.664509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.664575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.664591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.664634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.664650 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:53Z","lastTransitionTime":"2026-03-08T05:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.767645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.767721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.767735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.767755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.767768 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:53Z","lastTransitionTime":"2026-03-08T05:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.781312 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:53 crc kubenswrapper[4717]: E0308 05:27:53.781431 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.781519 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.781545 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.781533 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:53 crc kubenswrapper[4717]: E0308 05:27:53.781729 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:53 crc kubenswrapper[4717]: E0308 05:27:53.781837 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:53 crc kubenswrapper[4717]: E0308 05:27:53.781895 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.798289 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-ce
rt\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.809369 4717 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.819763 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.831907 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.842927 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.853483 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.864088 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.872078 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.872120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.872137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.872160 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.872176 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:53Z","lastTransitionTime":"2026-03-08T05:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.877365 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:
27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.896431 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.910003 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d5
3ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.921398 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.930299 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.957751 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.977150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.977180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.977191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.977210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.977221 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:53Z","lastTransitionTime":"2026-03-08T05:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.977936 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:53 crc kubenswrapper[4717]: I0308 05:27:53.995726 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.011673 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc 
kubenswrapper[4717]: I0308 05:27:54.081513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.081556 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.081567 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.081586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.081598 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.150285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.150865 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.150888 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.150917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.150942 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: E0308 05:27:54.167339 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.171758 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.171781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.171791 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.171806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.171817 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: E0308 05:27:54.186322 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.190234 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.190268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.190277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.190294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.190303 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: E0308 05:27:54.206504 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.212555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.212596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.212605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.212623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.212633 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: E0308 05:27:54.226420 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.230064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.230091 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.230099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.230110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.230119 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: E0308 05:27:54.240588 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: E0308 05:27:54.240867 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.242151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.242261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.242286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.242307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.242326 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.262148 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" event={"ID":"a5c6317f-efb5-4d91-b5df-c56e975f7c1c","Type":"ContainerDied","Data":"32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284"} Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.262740 4717 generic.go:334] "Generic (PLEG): container finished" podID="a5c6317f-efb5-4d91-b5df-c56e975f7c1c" containerID="32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284" exitCode=0 Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.287022 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.304388 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.331239 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.344967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.345023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.345039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.345060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.345074 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.349130 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309ea
fa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.363845 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc 
kubenswrapper[4717]: I0308 05:27:54.384165 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.400898 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.416122 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.440826 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.446978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.447002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.447011 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.447024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.447033 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.455731 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.471444 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.485793 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.507075 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.520749 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.539453 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.548786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.548816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.548825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.548838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.548849 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.555633 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.651888 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.651936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.651953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.651975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.652137 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.755985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.756030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.756040 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.756056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.756069 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.861353 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.861422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.861439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.861470 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.861494 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.964999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.965054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.965072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.965096 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:54 crc kubenswrapper[4717]: I0308 05:27:54.965116 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:54Z","lastTransitionTime":"2026-03-08T05:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.069031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.069098 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.069117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.069145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.069165 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:55Z","lastTransitionTime":"2026-03-08T05:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.172931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.172991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.173009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.173039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.173056 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:55Z","lastTransitionTime":"2026-03-08T05:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.272662 4717 generic.go:334] "Generic (PLEG): container finished" podID="a5c6317f-efb5-4d91-b5df-c56e975f7c1c" containerID="0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de" exitCode=0 Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.272752 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" event={"ID":"a5c6317f-efb5-4d91-b5df-c56e975f7c1c","Type":"ContainerDied","Data":"0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.275932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.276191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.276226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.276280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.276308 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:55Z","lastTransitionTime":"2026-03-08T05:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.277879 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/0.log" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.287375 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e" exitCode=2 Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.287433 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.288491 4717 scope.go:117] "RemoveContainer" containerID="8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.296565 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.314003 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.334342 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.350744 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc 
kubenswrapper[4717]: I0308 05:27:55.380451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.380793 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.380806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.380824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.380835 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:55Z","lastTransitionTime":"2026-03-08T05:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.381375 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.402889 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-0
8T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.415821 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.431000 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.442256 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee4
2f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.457047 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.486130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.486167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.486176 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.486190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.486199 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:55Z","lastTransitionTime":"2026-03-08T05:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.487168 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.510620 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.526553 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.541615 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.554301 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.571879 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.588574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:55 crc 
kubenswrapper[4717]: I0308 05:27:55.588631 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.588660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.588676 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.588712 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:55Z","lastTransitionTime":"2026-03-08T05:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.588881 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.602433 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.612118 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc 
kubenswrapper[4717]: I0308 05:27:55.629767 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"08-24T17:21:41Z\\\\nE0308 05:27:54.469410 6482 node_controller_manager.go:162] Stopping node network controller manager, err=failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z\\\\nI0308 05:27:54.469420 6482 nad_controller.go:166] [node-nad-controller NAD controller]: shutting down\\\\nI0308 05:27:54.469515 6482 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29105\\\\\\\"\\\\nI0308 05:27:54.469559 6482 ovnkube.go:595] Stopping ovnkube...\\\\nI0308 05:27:54.469568 6482 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2d0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x2a}\\\\nI0308 05:27:54.469630 6482 
ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.641808 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.658176 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.678138 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.691819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.691876 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.691893 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.691914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.691934 4717 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:55Z","lastTransitionTime":"2026-03-08T05:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.694878 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08
aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.706823 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.726224 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.744050 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.763088 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.781342 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.781347 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.781417 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.781169 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: E0308 05:27:55.782095 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.782174 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:55 crc kubenswrapper[4717]: E0308 05:27:55.782266 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:55 crc kubenswrapper[4717]: E0308 05:27:55.782399 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:55 crc kubenswrapper[4717]: E0308 05:27:55.782459 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.793824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.793904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.793924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.794015 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.794042 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:55Z","lastTransitionTime":"2026-03-08T05:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.803607 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.814518 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.828188 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:55Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.896781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.896816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.896827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.896843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.896855 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:55Z","lastTransitionTime":"2026-03-08T05:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.999508 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.999560 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.999572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:55 crc kubenswrapper[4717]: I0308 05:27:55.999592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:55.999607 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:55Z","lastTransitionTime":"2026-03-08T05:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.102539 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.102596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.102611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.102631 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.102642 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:56Z","lastTransitionTime":"2026-03-08T05:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.205787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.205844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.205856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.205875 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.205887 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:56Z","lastTransitionTime":"2026-03-08T05:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.295214 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/0.log" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.299382 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.300224 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.305190 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" event={"ID":"a5c6317f-efb5-4d91-b5df-c56e975f7c1c","Type":"ContainerStarted","Data":"bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.308488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.308640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.308976 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.309258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.309520 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:56Z","lastTransitionTime":"2026-03-08T05:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.315042 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc 
kubenswrapper[4717]: I0308 05:27:56.347541 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"08-24T17:21:41Z\\\\nE0308 05:27:54.469410 6482 node_controller_manager.go:162] Stopping node network controller manager, err=failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z\\\\nI0308 05:27:54.469420 6482 nad_controller.go:166] [node-nad-controller NAD controller]: shutting down\\\\nI0308 05:27:54.469515 6482 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29105\\\\\\\"\\\\nI0308 05:27:54.469559 6482 ovnkube.go:595] Stopping ovnkube...\\\\nI0308 05:27:54.469568 6482 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2d0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x2a}\\\\nI0308 05:27:54.469630 6482 
ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.367142 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.383993 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.403979 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.414413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.414471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.414490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.414512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.414527 4717 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:56Z","lastTransitionTime":"2026-03-08T05:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.420090 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08
aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.438809 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.501270 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.517030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.517082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.517099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:56 crc 
kubenswrapper[4717]: I0308 05:27:56.517120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.517136 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:56Z","lastTransitionTime":"2026-03-08T05:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.520775 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.535808 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.552775 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.568966 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.583656 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.605517 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.620592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.620656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.620675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.620727 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.620745 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:56Z","lastTransitionTime":"2026-03-08T05:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.621095 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.636190 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.677268 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.700588 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.718494 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.723002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.723048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.723062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.723079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.723092 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:56Z","lastTransitionTime":"2026-03-08T05:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.730044 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:
27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.750341 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.769701 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d5
3ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.786243 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.801954 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.825047 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.825099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.825109 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.825124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.825134 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:56Z","lastTransitionTime":"2026-03-08T05:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.838225 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"08-24T17:21:41Z\\\\nE0308 05:27:54.469410 6482 node_controller_manager.go:162] Stopping node network controller manager, err=failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z\\\\nI0308 05:27:54.469420 6482 nad_controller.go:166] [node-nad-controller NAD controller]: shutting down\\\\nI0308 05:27:54.469515 6482 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29105\\\\\\\"\\\\nI0308 05:27:54.469559 6482 ovnkube.go:595] Stopping ovnkube...\\\\nI0308 05:27:54.469568 6482 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2d0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x2a}\\\\nI0308 05:27:54.469630 6482 
ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.855062 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.877048 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.889715 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc 
kubenswrapper[4717]: I0308 05:27:56.903450 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.915627 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.925148 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.927133 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.927277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.927365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:56 crc 
kubenswrapper[4717]: I0308 05:27:56.927428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.927482 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:56Z","lastTransitionTime":"2026-03-08T05:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:56 crc kubenswrapper[4717]: I0308 05:27:56.940665 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d
191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.030091 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.030715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.030856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.030945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.031027 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:57Z","lastTransitionTime":"2026-03-08T05:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.134318 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.134360 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.134374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.134425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.134438 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:57Z","lastTransitionTime":"2026-03-08T05:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.236859 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.237144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.237275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.237365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.237448 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:57Z","lastTransitionTime":"2026-03-08T05:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.311218 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/1.log" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.312445 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/0.log" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.315293 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f" exitCode=1 Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.315345 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f"} Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.315384 4717 scope.go:117] "RemoveContainer" containerID="8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.316540 4717 scope.go:117] "RemoveContainer" containerID="5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f" Mar 08 05:27:57 crc kubenswrapper[4717]: E0308 05:27:57.316892 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.340754 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.340794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.340806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.340822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.340835 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:57Z","lastTransitionTime":"2026-03-08T05:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.341530 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.359539 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.383158 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.399109 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.414580 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.428772 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.443582 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:57 crc 
kubenswrapper[4717]: I0308 05:27:57.443613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.443624 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.443641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.443653 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:57Z","lastTransitionTime":"2026-03-08T05:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.453509 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.469513 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.490771 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.509152 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.526223 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc 
kubenswrapper[4717]: I0308 05:27:57.546497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.546564 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.546586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.546614 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.546635 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:57Z","lastTransitionTime":"2026-03-08T05:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.558072 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b32cb29ba330cedfd1242db6d56b9cd1d13637dae4c2863ea07b6ec7d632a4e\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"message\\\":\\\"08-24T17:21:41Z\\\\nE0308 05:27:54.469410 6482 node_controller_manager.go:162] Stopping node network controller manager, err=failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:54Z is after 2025-08-24T17:21:41Z\\\\nI0308 05:27:54.469420 6482 nad_controller.go:166] [node-nad-controller NAD controller]: shutting down\\\\nI0308 05:27:54.469515 6482 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29105\\\\\\\"\\\\nI0308 05:27:54.469559 6482 ovnkube.go:595] Stopping ovnkube...\\\\nI0308 05:27:54.469568 6482 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2d0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x2a}\\\\nI0308 05:27:54.469630 6482 ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"message\\\":\\\"l\\\\nI0308 05:27:56.598907 6699 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 05:27:56.598969 6699 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 05:27:56.598987 6699 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 05:27:56.599023 6699 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 
05:27:56.599045 6699 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:27:56.599058 6699 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:27:56.599069 6699 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:27:56.599772 6699 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:27:56.599845 6699 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 05:27:56.599858 6699 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 05:27:56.599905 6699 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 05:27:56.599932 6699 factory.go:656] Stopping watch factory\\\\nI0308 05:27:56.599950 6699 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:27:56.599979 6699 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:27:56.600005 6699 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:27:56.600019 6699 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.579499 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.597052 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.616769 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.636521 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:27:57Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.648964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.649029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.649054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.649082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.649104 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:57Z","lastTransitionTime":"2026-03-08T05:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.751855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.751917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.751935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.751959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.751978 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:57Z","lastTransitionTime":"2026-03-08T05:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.780663 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.780766 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.780772 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:57 crc kubenswrapper[4717]: E0308 05:27:57.780947 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.781002 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:57 crc kubenswrapper[4717]: E0308 05:27:57.781188 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:57 crc kubenswrapper[4717]: E0308 05:27:57.781344 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:57 crc kubenswrapper[4717]: E0308 05:27:57.781471 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.854497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.854537 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.854549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.854568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.854580 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:57Z","lastTransitionTime":"2026-03-08T05:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.957204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.957261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.957278 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.957300 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:57 crc kubenswrapper[4717]: I0308 05:27:57.957316 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:57Z","lastTransitionTime":"2026-03-08T05:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.059427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.059475 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.059491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.059514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.059532 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:58Z","lastTransitionTime":"2026-03-08T05:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.164081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.164143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.164162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.164181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.164194 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:58Z","lastTransitionTime":"2026-03-08T05:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.268479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.268592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.268622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.268656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.268679 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:58Z","lastTransitionTime":"2026-03-08T05:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.323449 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/1.log" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.329349 4717 scope.go:117] "RemoveContainer" containerID="5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f" Mar 08 05:27:58 crc kubenswrapper[4717]: E0308 05:27:58.329543 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.347195 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.363163 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.372400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.372465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.372481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.372506 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.372524 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:58Z","lastTransitionTime":"2026-03-08T05:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.380988 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc 
kubenswrapper[4717]: I0308 05:27:58.411948 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"message\\\":\\\"l\\\\nI0308 05:27:56.598907 6699 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 05:27:56.598969 6699 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 05:27:56.598987 6699 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 05:27:56.599023 6699 handler.go:208] Removed *v1.Pod event 
handler 6\\\\nI0308 05:27:56.599045 6699 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:27:56.599058 6699 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:27:56.599069 6699 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:27:56.599772 6699 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:27:56.599845 6699 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 05:27:56.599858 6699 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 05:27:56.599905 6699 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 05:27:56.599932 6699 factory.go:656] Stopping watch factory\\\\nI0308 05:27:56.599950 6699 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:27:56.599979 6699 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:27:56.600005 6699 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:27:56.600019 6699 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.429536 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.448656 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.474193 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.475470 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.475535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.475557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.475584 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.475603 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:58Z","lastTransitionTime":"2026-03-08T05:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.490841 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.523542 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.546844 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.568964 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.578644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.578708 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.578721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.578738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.578750 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:58Z","lastTransitionTime":"2026-03-08T05:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.591740 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.615507 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.634667 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.652135 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.672599 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:27:58Z is after 2025-08-24T17:21:41Z" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.682127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.682198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.682221 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.682246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.682268 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:58Z","lastTransitionTime":"2026-03-08T05:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.784900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.784957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.784970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.784986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.785000 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:58Z","lastTransitionTime":"2026-03-08T05:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.887902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.887990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.888013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.888042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.888065 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:58Z","lastTransitionTime":"2026-03-08T05:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.991074 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.991158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.991184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.991216 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:58 crc kubenswrapper[4717]: I0308 05:27:58.991247 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:58Z","lastTransitionTime":"2026-03-08T05:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.095165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.095248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.095267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.095292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.095313 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:59Z","lastTransitionTime":"2026-03-08T05:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.198475 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.198550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.198569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.198593 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.198615 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:59Z","lastTransitionTime":"2026-03-08T05:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.304221 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.304284 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.304302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.304325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.304341 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:59Z","lastTransitionTime":"2026-03-08T05:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.406785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.406856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.406878 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.406906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.406926 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:59Z","lastTransitionTime":"2026-03-08T05:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.510168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.510215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.510227 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.510243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.510257 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:59Z","lastTransitionTime":"2026-03-08T05:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.613778 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.613820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.613832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.613849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.613860 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:59Z","lastTransitionTime":"2026-03-08T05:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.717801 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.718197 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.718326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.718468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.718668 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:59Z","lastTransitionTime":"2026-03-08T05:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.781105 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.781209 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:27:59 crc kubenswrapper[4717]: E0308 05:27:59.781303 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.781218 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:27:59 crc kubenswrapper[4717]: E0308 05:27:59.781369 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.781308 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:27:59 crc kubenswrapper[4717]: E0308 05:27:59.781613 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:27:59 crc kubenswrapper[4717]: E0308 05:27:59.781720 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.822462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.822527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.822578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.822607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.822627 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:59Z","lastTransitionTime":"2026-03-08T05:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.926195 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.926558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.926776 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.926988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:27:59 crc kubenswrapper[4717]: I0308 05:27:59.927169 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:27:59Z","lastTransitionTime":"2026-03-08T05:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.030809 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.030876 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.030895 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.030925 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.030948 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:00Z","lastTransitionTime":"2026-03-08T05:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.135286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.135319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.135329 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.135345 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.135357 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:00Z","lastTransitionTime":"2026-03-08T05:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.238136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.238784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.238977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.239150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.239298 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:00Z","lastTransitionTime":"2026-03-08T05:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.342924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.343023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.343042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.343071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.343093 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:00Z","lastTransitionTime":"2026-03-08T05:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.446537 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.446616 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.446641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.446673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.446733 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:00Z","lastTransitionTime":"2026-03-08T05:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.549841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.550403 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.550557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.550722 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.550870 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:00Z","lastTransitionTime":"2026-03-08T05:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.654482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.654597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.654622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.654658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.654719 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:00Z","lastTransitionTime":"2026-03-08T05:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.759001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.759072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.759095 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.759123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.759152 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:00Z","lastTransitionTime":"2026-03-08T05:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.862074 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.862132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.862150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.862175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.862193 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:00Z","lastTransitionTime":"2026-03-08T05:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.964806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.964886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.964909 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.964941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:00 crc kubenswrapper[4717]: I0308 05:28:00.964965 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:00Z","lastTransitionTime":"2026-03-08T05:28:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.068755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.069104 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.069282 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.069466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.069601 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:01Z","lastTransitionTime":"2026-03-08T05:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.173128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.173191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.173209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.173233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.173251 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:01Z","lastTransitionTime":"2026-03-08T05:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.275932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.276008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.276028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.276058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.276078 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:01Z","lastTransitionTime":"2026-03-08T05:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.379229 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.379291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.379309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.379334 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.379377 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:01Z","lastTransitionTime":"2026-03-08T05:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.483117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.483189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.483209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.483234 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.483253 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:01Z","lastTransitionTime":"2026-03-08T05:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.587651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.587751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.587770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.587800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.587819 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:01Z","lastTransitionTime":"2026-03-08T05:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.691207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.691284 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.691302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.691329 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.691345 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:01Z","lastTransitionTime":"2026-03-08T05:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.781152 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.781240 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.781348 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:01 crc kubenswrapper[4717]: E0308 05:28:01.781379 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:01 crc kubenswrapper[4717]: E0308 05:28:01.781550 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.781844 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:01 crc kubenswrapper[4717]: E0308 05:28:01.781841 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:01 crc kubenswrapper[4717]: E0308 05:28:01.782085 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.794815 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.794881 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.794902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.794939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.794967 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:01Z","lastTransitionTime":"2026-03-08T05:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.899129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.899222 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.899250 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.899287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:01 crc kubenswrapper[4717]: I0308 05:28:01.899312 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:01Z","lastTransitionTime":"2026-03-08T05:28:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.002936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.003009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.003047 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.003082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.003104 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:02Z","lastTransitionTime":"2026-03-08T05:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.107462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.107535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.107553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.107583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.107608 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:02Z","lastTransitionTime":"2026-03-08T05:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.211909 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.211987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.212008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.212041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.212061 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:02Z","lastTransitionTime":"2026-03-08T05:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.315832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.315909 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.315929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.315961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.315982 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:02Z","lastTransitionTime":"2026-03-08T05:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.418844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.419353 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.419371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.419397 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.419419 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:02Z","lastTransitionTime":"2026-03-08T05:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.522904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.522967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.522990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.523019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.523039 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:02Z","lastTransitionTime":"2026-03-08T05:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.626077 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.626125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.626185 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.626209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.626231 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:02Z","lastTransitionTime":"2026-03-08T05:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.729496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.729567 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.729586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.729615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.729633 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:02Z","lastTransitionTime":"2026-03-08T05:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.832373 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.832464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.832487 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.832519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.832541 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:02Z","lastTransitionTime":"2026-03-08T05:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.935873 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.936006 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.936028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.936056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:02 crc kubenswrapper[4717]: I0308 05:28:02.936074 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:02Z","lastTransitionTime":"2026-03-08T05:28:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.039420 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.039497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.039526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.039557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.039580 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:03Z","lastTransitionTime":"2026-03-08T05:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.143520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.143602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.143622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.143652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.143676 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:03Z","lastTransitionTime":"2026-03-08T05:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.248432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.248509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.248530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.248562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.248584 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:03Z","lastTransitionTime":"2026-03-08T05:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.351931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.352001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.352019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.352046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.352070 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:03Z","lastTransitionTime":"2026-03-08T05:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.455583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.455656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.455676 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.455748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.455767 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:03Z","lastTransitionTime":"2026-03-08T05:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.559332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.559400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.559424 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.559452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.559471 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:03Z","lastTransitionTime":"2026-03-08T05:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.662736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.662844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.662871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.662901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.662920 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:03Z","lastTransitionTime":"2026-03-08T05:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.766897 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.767017 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.767045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.767082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.767107 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:03Z","lastTransitionTime":"2026-03-08T05:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.781627 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.781670 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.781835 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:03 crc kubenswrapper[4717]: E0308 05:28:03.782114 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.782207 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:03 crc kubenswrapper[4717]: E0308 05:28:03.782597 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:03 crc kubenswrapper[4717]: E0308 05:28:03.782663 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:03 crc kubenswrapper[4717]: E0308 05:28:03.782913 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.811369 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.833546 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.857424 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.870674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.870798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.870820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.870848 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.870866 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:03Z","lastTransitionTime":"2026-03-08T05:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.884375 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:
27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.923753 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.950078 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d5
3ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.968232 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.974591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.974663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.974720 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.974760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.974784 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:03Z","lastTransitionTime":"2026-03-08T05:28:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:03 crc kubenswrapper[4717]: I0308 05:28:03.987023 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.023476 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"message\\\":\\\"l\\\\nI0308 05:27:56.598907 6699 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 05:27:56.598969 6699 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 05:27:56.598987 6699 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 05:27:56.599023 6699 handler.go:208] Removed *v1.Pod event 
handler 6\\\\nI0308 05:27:56.599045 6699 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:27:56.599058 6699 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:27:56.599069 6699 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:27:56.599772 6699 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:27:56.599845 6699 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 05:27:56.599858 6699 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 05:27:56.599905 6699 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 05:27:56.599932 6699 factory.go:656] Stopping watch factory\\\\nI0308 05:27:56.599950 6699 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:27:56.599979 6699 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:27:56.600005 6699 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:27:56.600019 6699 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.047452 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.069776 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.077538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.077591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.077610 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.077638 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.077658 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.086774 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc 
kubenswrapper[4717]: I0308 05:28:04.105829 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.127350 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.147113 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.171283 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.180832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.180912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.180935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.180968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.180989 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.284287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.284352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.284375 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.284551 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.284581 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.388255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.388303 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.388317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.388335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.388347 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.491842 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.491925 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.491948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.491978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.492001 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.578260 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.578316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.578327 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.578347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.578364 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: E0308 05:28:04.591717 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.597068 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.597112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.597127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.597144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.597156 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: E0308 05:28:04.618116 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.622174 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.622226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.622237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.622258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.622269 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: E0308 05:28:04.637502 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.642448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.642501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.642512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.642531 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.642542 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: E0308 05:28:04.661610 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.666781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.666854 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.666897 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.666935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.666962 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: E0308 05:28:04.687865 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:04 crc kubenswrapper[4717]: E0308 05:28:04.688034 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.690552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.690597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.690606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.690621 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.690630 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.782407 4717 scope.go:117] "RemoveContainer" containerID="541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.793459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.793526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.793547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.793576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.793597 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.897036 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.897105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.897122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.897150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:04 crc kubenswrapper[4717]: I0308 05:28:04.897170 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:04Z","lastTransitionTime":"2026-03-08T05:28:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.000123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.000180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.000199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.000225 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.000243 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:05Z","lastTransitionTime":"2026-03-08T05:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.103461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.103535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.103582 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.103611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.103632 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:05Z","lastTransitionTime":"2026-03-08T05:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.206923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.207001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.207023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.207051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.207068 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:05Z","lastTransitionTime":"2026-03-08T05:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.311050 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.311134 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.311160 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.311191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.311214 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:05Z","lastTransitionTime":"2026-03-08T05:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.361471 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.364666 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.365342 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.380042 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.398564 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.415239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.415302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.415332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:05 crc 
kubenswrapper[4717]: I0308 05:28:05.415368 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.415391 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:05Z","lastTransitionTime":"2026-03-08T05:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.418062 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d
191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.437888 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee4
2f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.472643 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.492776 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.513643 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.519430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.519484 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.519504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 
05:28:05.519538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.519565 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:05Z","lastTransitionTime":"2026-03-08T05:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.534710 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.555377 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.559588 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:28:05 crc 
kubenswrapper[4717]: E0308 05:28:05.559859 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:28:37.55981473 +0000 UTC m=+144.477463604 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.559943 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.560194 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.560301 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:28:37.560275442 +0000 UTC m=+144.477924326 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.560329 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.560206 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.560414 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:28:37.560392715 +0000 UTC m=+144.478041769 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.575750 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.596017 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.611155 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.623717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.623795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.623849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.623886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.623913 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:05Z","lastTransitionTime":"2026-03-08T05:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.634592 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.656676 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.674398 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc 
kubenswrapper[4717]: I0308 05:28:05.711600 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"message\\\":\\\"l\\\\nI0308 05:27:56.598907 6699 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 05:27:56.598969 6699 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 05:27:56.598987 6699 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 05:27:56.599023 6699 handler.go:208] Removed *v1.Pod event 
handler 6\\\\nI0308 05:27:56.599045 6699 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:27:56.599058 6699 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:27:56.599069 6699 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:27:56.599772 6699 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:27:56.599845 6699 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 05:27:56.599858 6699 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 05:27:56.599905 6699 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 05:27:56.599932 6699 factory.go:656] Stopping watch factory\\\\nI0308 05:27:56.599950 6699 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:27:56.599979 6699 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:27:56.600005 6699 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:27:56.600019 6699 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:05Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.726919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.726988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.727008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.727042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.727064 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:05Z","lastTransitionTime":"2026-03-08T05:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.761875 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.761970 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.762049 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.762093 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.762131 4717 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.762135 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.762147 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.762158 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.762173 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.762232 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 05:28:37.762205213 +0000 UTC m=+144.679854067 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.762235 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.762253 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 05:28:37.762244634 +0000 UTC m=+144.679893488 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.762315 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs podName:dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7 nodeName:}" failed. No retries permitted until 2026-03-08 05:28:37.762296165 +0000 UTC m=+144.679945009 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs") pod "network-metrics-daemon-d64q9" (UID: "dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.781580 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.781627 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.781663 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.781621 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.781772 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.781926 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.782077 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:05 crc kubenswrapper[4717]: E0308 05:28:05.782169 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.830005 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.830059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.830067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.830083 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.830094 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:05Z","lastTransitionTime":"2026-03-08T05:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.933191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.933237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.933250 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.933269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:05 crc kubenswrapper[4717]: I0308 05:28:05.933284 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:05Z","lastTransitionTime":"2026-03-08T05:28:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.041206 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.041280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.041292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.041311 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.041323 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:06Z","lastTransitionTime":"2026-03-08T05:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.144376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.144441 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.144460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.144488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.144509 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:06Z","lastTransitionTime":"2026-03-08T05:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.248409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.248479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.248500 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.248529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.248577 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:06Z","lastTransitionTime":"2026-03-08T05:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.352449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.352518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.352537 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.352569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.352588 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:06Z","lastTransitionTime":"2026-03-08T05:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.455790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.455867 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.455891 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.455962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.455986 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:06Z","lastTransitionTime":"2026-03-08T05:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.558985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.559060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.559089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.559122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.559144 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:06Z","lastTransitionTime":"2026-03-08T05:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.662824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.663389 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.663543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.663719 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.663914 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:06Z","lastTransitionTime":"2026-03-08T05:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.768328 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.768402 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.768449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.768478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.768495 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:06Z","lastTransitionTime":"2026-03-08T05:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.871351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.871414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.871435 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.871463 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.871483 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:06Z","lastTransitionTime":"2026-03-08T05:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.974857 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.974920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.974935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.974957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:06 crc kubenswrapper[4717]: I0308 05:28:06.974971 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:06Z","lastTransitionTime":"2026-03-08T05:28:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.078299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.078371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.078392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.078424 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.078446 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:07Z","lastTransitionTime":"2026-03-08T05:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.181779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.181858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.181882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.181914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.181935 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:07Z","lastTransitionTime":"2026-03-08T05:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.285190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.285251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.285269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.285292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.285312 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:07Z","lastTransitionTime":"2026-03-08T05:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.388032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.388849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.388887 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.388920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.388956 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:07Z","lastTransitionTime":"2026-03-08T05:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.491983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.492063 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.492083 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.492121 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.492143 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:07Z","lastTransitionTime":"2026-03-08T05:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.595388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.595438 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.595455 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.595479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.595496 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:07Z","lastTransitionTime":"2026-03-08T05:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.698550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.698644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.698677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.698756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.698779 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:07Z","lastTransitionTime":"2026-03-08T05:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.781041 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.781114 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:07 crc kubenswrapper[4717]: E0308 05:28:07.781300 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.781334 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:07 crc kubenswrapper[4717]: E0308 05:28:07.781482 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:07 crc kubenswrapper[4717]: E0308 05:28:07.781624 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.782367 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:07 crc kubenswrapper[4717]: E0308 05:28:07.782586 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.802055 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.802115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.802131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.802180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.802199 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:07Z","lastTransitionTime":"2026-03-08T05:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.905936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.905994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.906007 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.906031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:07 crc kubenswrapper[4717]: I0308 05:28:07.906048 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:07Z","lastTransitionTime":"2026-03-08T05:28:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.009234 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.009317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.009338 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.009371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.009392 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:08Z","lastTransitionTime":"2026-03-08T05:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.112967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.113022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.113039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.113061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.113075 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:08Z","lastTransitionTime":"2026-03-08T05:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.217601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.217666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.217680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.217739 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.217755 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:08Z","lastTransitionTime":"2026-03-08T05:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.321175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.321227 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.321263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.321286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.321300 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:08Z","lastTransitionTime":"2026-03-08T05:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.425544 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.425617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.425639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.425672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.425744 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:08Z","lastTransitionTime":"2026-03-08T05:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.528940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.528979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.528992 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.529011 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.529022 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:08Z","lastTransitionTime":"2026-03-08T05:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.632097 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.632173 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.632194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.632220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.632243 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:08Z","lastTransitionTime":"2026-03-08T05:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.735379 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.735450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.735470 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.735498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.735523 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:08Z","lastTransitionTime":"2026-03-08T05:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.839255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.839348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.839380 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.839421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.839451 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:08Z","lastTransitionTime":"2026-03-08T05:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.943051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.943122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.943142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.943172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:08 crc kubenswrapper[4717]: I0308 05:28:08.943191 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:08Z","lastTransitionTime":"2026-03-08T05:28:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.046126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.046188 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.046211 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.046239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.046258 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:09Z","lastTransitionTime":"2026-03-08T05:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.149771 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.149836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.149855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.149880 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.149899 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:09Z","lastTransitionTime":"2026-03-08T05:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.254831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.254910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.254928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.254956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.254973 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:09Z","lastTransitionTime":"2026-03-08T05:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.358340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.358389 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.358409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.358439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.358457 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:09Z","lastTransitionTime":"2026-03-08T05:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.461540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.461720 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.461784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.461819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.461907 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:09Z","lastTransitionTime":"2026-03-08T05:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.565462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.565540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.565565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.565595 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.565616 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:09Z","lastTransitionTime":"2026-03-08T05:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.669759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.669816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.669833 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.669861 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.669881 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:09Z","lastTransitionTime":"2026-03-08T05:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.774378 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.774443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.774465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.774498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.774521 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:09Z","lastTransitionTime":"2026-03-08T05:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.781236 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:09 crc kubenswrapper[4717]: E0308 05:28:09.781523 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.781981 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.782086 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:09 crc kubenswrapper[4717]: E0308 05:28:09.782167 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.782185 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:09 crc kubenswrapper[4717]: E0308 05:28:09.782378 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:09 crc kubenswrapper[4717]: E0308 05:28:09.782584 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.877792 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.877870 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.877892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.877919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.877939 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:09Z","lastTransitionTime":"2026-03-08T05:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.982172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.982236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.982255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.982279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:09 crc kubenswrapper[4717]: I0308 05:28:09.982297 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:09Z","lastTransitionTime":"2026-03-08T05:28:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.085841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.085920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.085946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.085980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.086003 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:10Z","lastTransitionTime":"2026-03-08T05:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.189363 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.189465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.189485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.189515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.189537 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:10Z","lastTransitionTime":"2026-03-08T05:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.293116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.293245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.293322 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.293408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.293447 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:10Z","lastTransitionTime":"2026-03-08T05:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.396893 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.396963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.396983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.397019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.397126 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:10Z","lastTransitionTime":"2026-03-08T05:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.500896 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.500981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.501009 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.501042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.501065 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:10Z","lastTransitionTime":"2026-03-08T05:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.604268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.604341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.604361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.604388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.604415 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:10Z","lastTransitionTime":"2026-03-08T05:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.707851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.707920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.707939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.707964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.708023 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:10Z","lastTransitionTime":"2026-03-08T05:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.811069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.811125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.811141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.811163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.811178 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:10Z","lastTransitionTime":"2026-03-08T05:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.915155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.915203 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.915213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.915231 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:10 crc kubenswrapper[4717]: I0308 05:28:10.915243 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:10Z","lastTransitionTime":"2026-03-08T05:28:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.019237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.019293 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.019311 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.019337 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.019479 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:11Z","lastTransitionTime":"2026-03-08T05:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.122999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.123089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.123118 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.123160 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.123190 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:11Z","lastTransitionTime":"2026-03-08T05:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.226306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.226376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.226394 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.226422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.226442 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:11Z","lastTransitionTime":"2026-03-08T05:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.330154 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.330232 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.330252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.330281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.330304 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:11Z","lastTransitionTime":"2026-03-08T05:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.433150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.433225 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.433252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.433280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.433300 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:11Z","lastTransitionTime":"2026-03-08T05:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.536385 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.536453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.536472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.536500 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.536518 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:11Z","lastTransitionTime":"2026-03-08T05:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.639841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.639932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.639952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.639985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.640005 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:11Z","lastTransitionTime":"2026-03-08T05:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.742872 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.742938 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.742956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.742981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.742997 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:11Z","lastTransitionTime":"2026-03-08T05:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.781111 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.781213 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.781111 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:11 crc kubenswrapper[4717]: E0308 05:28:11.781295 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.781378 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:11 crc kubenswrapper[4717]: E0308 05:28:11.781493 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:11 crc kubenswrapper[4717]: E0308 05:28:11.781635 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:11 crc kubenswrapper[4717]: E0308 05:28:11.781780 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.846572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.846626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.846640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.846661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.846674 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:11Z","lastTransitionTime":"2026-03-08T05:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.949695 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.949735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.949745 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.949761 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:11 crc kubenswrapper[4717]: I0308 05:28:11.949773 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:11Z","lastTransitionTime":"2026-03-08T05:28:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.052211 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.052253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.052266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.052283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.052295 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:12Z","lastTransitionTime":"2026-03-08T05:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.154819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.154882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.154900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.154924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.154942 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:12Z","lastTransitionTime":"2026-03-08T05:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.258966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.259036 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.259058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.259087 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.259108 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:12Z","lastTransitionTime":"2026-03-08T05:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.362755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.362841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.362860 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.362886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.362907 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:12Z","lastTransitionTime":"2026-03-08T05:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.466073 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.466122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.466139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.466161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.466180 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:12Z","lastTransitionTime":"2026-03-08T05:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.568736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.568798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.568816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.568844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.568862 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:12Z","lastTransitionTime":"2026-03-08T05:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.671270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.671313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.671322 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.671336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.671347 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:12Z","lastTransitionTime":"2026-03-08T05:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.774288 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.774344 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.774361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.774384 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.774401 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:12Z","lastTransitionTime":"2026-03-08T05:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.782115 4717 scope.go:117] "RemoveContainer" containerID="5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.876981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.877525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.877548 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.877985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.878006 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:12Z","lastTransitionTime":"2026-03-08T05:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.981173 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.981206 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.981221 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.981236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:12 crc kubenswrapper[4717]: I0308 05:28:12.981244 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:12Z","lastTransitionTime":"2026-03-08T05:28:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.083222 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.083267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.083281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.083301 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.083314 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:13Z","lastTransitionTime":"2026-03-08T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.190064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.190130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.190146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.190169 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.190187 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:13Z","lastTransitionTime":"2026-03-08T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.292799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.292834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.292842 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.292856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.292865 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:13Z","lastTransitionTime":"2026-03-08T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.395013 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.395042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.395051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.395066 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.395077 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:13Z","lastTransitionTime":"2026-03-08T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.405710 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/1.log" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.408547 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623"} Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.409181 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.431609 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02
b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.451590 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.465863 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.483565 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.497858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.497918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.497932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.497948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.497960 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:13Z","lastTransitionTime":"2026-03-08T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.500079 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.515246 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.525800 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.536057 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.548032 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.561030 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc 
kubenswrapper[4717]: I0308 05:28:13.582279 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"message\\\":\\\"l\\\\nI0308 05:27:56.598907 6699 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 05:27:56.598969 6699 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 05:27:56.598987 6699 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 05:27:56.599023 6699 handler.go:208] Removed *v1.Pod event 
handler 6\\\\nI0308 05:27:56.599045 6699 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:27:56.599058 6699 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:27:56.599069 6699 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:27:56.599772 6699 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:27:56.599845 6699 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 05:27:56.599858 6699 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 05:27:56.599905 6699 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 05:27:56.599932 6699 factory.go:656] Stopping watch factory\\\\nI0308 05:27:56.599950 6699 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:27:56.599979 6699 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:27:56.600005 6699 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:27:56.600019 6699 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.599886 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.600847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.600899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.600913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.600933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.600946 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:13Z","lastTransitionTime":"2026-03-08T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.613359 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.629060 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.640574 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.653972 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.703265 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.703310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.703319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.703334 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.703347 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:13Z","lastTransitionTime":"2026-03-08T05:28:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.780712 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.780810 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.780826 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.780870 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:13 crc kubenswrapper[4717]: E0308 05:28:13.780969 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:13 crc kubenswrapper[4717]: E0308 05:28:13.780888 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:13 crc kubenswrapper[4717]: E0308 05:28:13.781150 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:13 crc kubenswrapper[4717]: E0308 05:28:13.781224 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.797207 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: E0308 05:28:13.803871 4717 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.809949 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc 
kubenswrapper[4717]: I0308 05:28:13.832893 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"message\\\":\\\"l\\\\nI0308 05:27:56.598907 6699 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 05:27:56.598969 6699 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 05:27:56.598987 6699 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 05:27:56.599023 6699 handler.go:208] Removed *v1.Pod event 
handler 6\\\\nI0308 05:27:56.599045 6699 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:27:56.599058 6699 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:27:56.599069 6699 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:27:56.599772 6699 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:27:56.599845 6699 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 05:27:56.599858 6699 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 05:27:56.599905 6699 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 05:27:56.599932 6699 factory.go:656] Stopping watch factory\\\\nI0308 05:27:56.599950 6699 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:27:56.599979 6699 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:27:56.600005 6699 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:27:56.600019 6699 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.849909 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.862584 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: E0308 05:28:13.876450 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.881421 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCo
unt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",
\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.893043 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f1
2962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.905048 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.927092 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.943200 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.956674 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.969358 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.982830 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:13 crc kubenswrapper[4717]: I0308 05:28:13.997146 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.008290 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.018865 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.415403 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/2.log" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.416776 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/1.log" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.421014 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623" exitCode=1 Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.421052 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623"} Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.421092 4717 scope.go:117] "RemoveContainer" containerID="5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.423731 4717 scope.go:117] "RemoveContainer" containerID="f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623" Mar 08 05:28:14 crc kubenswrapper[4717]: E0308 05:28:14.425377 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.443591 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.475801 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.496643 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.515507 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.532359 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.606789 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.618936 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.631653 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.647002 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.665489 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.678471 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc 
kubenswrapper[4717]: I0308 05:28:14.702034 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5edd2d6928d1be2129aba1d23a8421746be723236f5156ccf8d00cf0c3a8657f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"message\\\":\\\"l\\\\nI0308 05:27:56.598907 6699 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0308 05:27:56.598969 6699 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 05:27:56.598987 6699 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 05:27:56.599023 6699 handler.go:208] Removed *v1.Pod event 
handler 6\\\\nI0308 05:27:56.599045 6699 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:27:56.599058 6699 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:27:56.599069 6699 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:27:56.599772 6699 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:27:56.599845 6699 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 05:27:56.599858 6699 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 05:27:56.599905 6699 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 05:27:56.599932 6699 factory.go:656] Stopping watch factory\\\\nI0308 05:27:56.599950 6699 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:27:56.599979 6699 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:27:56.600005 6699 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:27:56.600019 6699 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:13Z\\\",\\\"message\\\":\\\"andler 7 for removal\\\\nI0308 05:28:13.692833 6932 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.692837 6932 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:13.692842 6932 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:13.692853 6932 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:13.692859 6932 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:13.692956 6932 reflector.go:311] Stopping reflector 
*v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693014 6932 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693263 6932 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693502 6932 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0308 05:28:13.693700 6932 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:13.693731 6932 factory.go:656] Stopping watch factory\\\\nI0308 05:28:13.693745 6932 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.712620 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 
05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.712663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.712708 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.712733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.712752 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:14Z","lastTransitionTime":"2026-03-08T05:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.722668 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: E0308 05:28:14.729497 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.733966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.734008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.734028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.734054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.734072 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:14Z","lastTransitionTime":"2026-03-08T05:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.737500 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: E0308 05:28:14.751575 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.756879 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.757015 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.757039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.757064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.757127 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:14Z","lastTransitionTime":"2026-03-08T05:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.757466 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.771524 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: E0308 05:28:14.775243 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f
06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.778824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.778881 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.778899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.778924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.778944 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:14Z","lastTransitionTime":"2026-03-08T05:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:14 crc kubenswrapper[4717]: E0308 05:28:14.793501 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.796565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.796763 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.796914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.797042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:14 crc kubenswrapper[4717]: I0308 05:28:14.797154 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:14Z","lastTransitionTime":"2026-03-08T05:28:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:14 crc kubenswrapper[4717]: E0308 05:28:14.811762 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:14 crc kubenswrapper[4717]: E0308 05:28:14.811949 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.426492 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/2.log" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.431158 4717 scope.go:117] "RemoveContainer" containerID="f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623" Mar 08 05:28:15 crc kubenswrapper[4717]: E0308 05:28:15.431332 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.451653 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.466138 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.487391 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.506599 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.521800 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc 
kubenswrapper[4717]: I0308 05:28:15.552791 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:13Z\\\",\\\"message\\\":\\\"andler 7 for removal\\\\nI0308 05:28:13.692833 6932 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.692837 6932 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:13.692842 6932 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:13.692853 6932 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:13.692859 6932 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:13.692956 6932 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693014 6932 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693263 6932 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693502 6932 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0308 05:28:13.693700 6932 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:13.693731 6932 factory.go:656] Stopping watch factory\\\\nI0308 05:28:13.693745 6932 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.572619 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.586548 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.610021 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.626287 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.645572 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.676852 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.694731 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.710964 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.726623 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.744202 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:15Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.780750 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.780890 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.780776 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:15 crc kubenswrapper[4717]: I0308 05:28:15.781002 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:15 crc kubenswrapper[4717]: E0308 05:28:15.780987 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:15 crc kubenswrapper[4717]: E0308 05:28:15.781212 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:15 crc kubenswrapper[4717]: E0308 05:28:15.781368 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:15 crc kubenswrapper[4717]: E0308 05:28:15.781515 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:17 crc kubenswrapper[4717]: I0308 05:28:17.781383 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:17 crc kubenswrapper[4717]: I0308 05:28:17.781404 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:17 crc kubenswrapper[4717]: I0308 05:28:17.781474 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:17 crc kubenswrapper[4717]: I0308 05:28:17.781546 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:17 crc kubenswrapper[4717]: E0308 05:28:17.781738 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:17 crc kubenswrapper[4717]: E0308 05:28:17.781868 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:17 crc kubenswrapper[4717]: E0308 05:28:17.781995 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:17 crc kubenswrapper[4717]: E0308 05:28:17.782241 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:17 crc kubenswrapper[4717]: I0308 05:28:17.798184 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 08 05:28:18 crc kubenswrapper[4717]: E0308 05:28:18.878348 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:28:19 crc kubenswrapper[4717]: I0308 05:28:19.781046 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:19 crc kubenswrapper[4717]: I0308 05:28:19.781096 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:19 crc kubenswrapper[4717]: E0308 05:28:19.781763 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:19 crc kubenswrapper[4717]: I0308 05:28:19.781230 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:19 crc kubenswrapper[4717]: I0308 05:28:19.781206 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:19 crc kubenswrapper[4717]: E0308 05:28:19.781899 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:19 crc kubenswrapper[4717]: E0308 05:28:19.782038 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:19 crc kubenswrapper[4717]: E0308 05:28:19.782144 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.456438 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.478981 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.494296 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.510898 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.526983 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.556315 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.578874 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.595108 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.614840 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.627951 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.652829 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:13Z\\\",\\\"message\\\":\\\"andler 7 for removal\\\\nI0308 05:28:13.692833 6932 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.692837 6932 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:13.692842 6932 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:13.692853 6932 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:13.692859 6932 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:13.692956 6932 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693014 6932 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693263 6932 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693502 6932 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0308 05:28:13.693700 6932 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:13.693731 6932 factory.go:656] Stopping watch factory\\\\nI0308 05:28:13.693745 6932 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.668774 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.684324 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.699570 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc 
kubenswrapper[4717]: I0308 05:28:21.718438 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.738172 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.752840 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.774370 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:21Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.780810 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.780882 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.781205 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:21 crc kubenswrapper[4717]: I0308 05:28:21.781403 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:21 crc kubenswrapper[4717]: E0308 05:28:21.781519 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:21 crc kubenswrapper[4717]: E0308 05:28:21.781414 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:21 crc kubenswrapper[4717]: E0308 05:28:21.781584 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:21 crc kubenswrapper[4717]: E0308 05:28:21.781643 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.780739 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.780844 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.780785 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.780846 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:23 crc kubenswrapper[4717]: E0308 05:28:23.780911 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:23 crc kubenswrapper[4717]: E0308 05:28:23.781023 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:23 crc kubenswrapper[4717]: E0308 05:28:23.781164 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:23 crc kubenswrapper[4717]: E0308 05:28:23.781264 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.792191 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.804786 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.817067 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.835010 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:13Z\\\",\\\"message\\\":\\\"andler 7 for removal\\\\nI0308 05:28:13.692833 6932 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.692837 6932 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:13.692842 6932 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:13.692853 6932 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:13.692859 6932 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:13.692956 6932 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693014 6932 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693263 6932 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693502 6932 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0308 05:28:13.693700 6932 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:13.693731 6932 factory.go:656] Stopping watch factory\\\\nI0308 05:28:13.693745 6932 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.846960 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.859838 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.871084 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc 
kubenswrapper[4717]: E0308 05:28:23.878890 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.888971 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.902420 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.919789 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.940659 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.958488 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.970097 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.981151 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:23 crc kubenswrapper[4717]: I0308 05:28:23.999867 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:23Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:24 crc kubenswrapper[4717]: I0308 05:28:24.033845 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:24Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:24 crc kubenswrapper[4717]: I0308 05:28:24.049270 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:24Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.202144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.202224 4717 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.202248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.202277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.202295 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:25Z","lastTransitionTime":"2026-03-08T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:25 crc kubenswrapper[4717]: E0308 05:28:25.222865 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:25Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.228011 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.228143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.228163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.228231 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.228255 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:25Z","lastTransitionTime":"2026-03-08T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:25 crc kubenswrapper[4717]: E0308 05:28:25.248517 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:25Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.253592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.253678 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.253757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.253780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.253797 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:25Z","lastTransitionTime":"2026-03-08T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:25 crc kubenswrapper[4717]: E0308 05:28:25.274483 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:25Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.279578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.279626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.279643 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.279670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.279728 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:25Z","lastTransitionTime":"2026-03-08T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:25 crc kubenswrapper[4717]: E0308 05:28:25.300054 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:25Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.304477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.304533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.304555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.304582 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.304603 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:25Z","lastTransitionTime":"2026-03-08T05:28:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:25 crc kubenswrapper[4717]: E0308 05:28:25.327134 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:25Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:25 crc kubenswrapper[4717]: E0308 05:28:25.327496 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.781255 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.781330 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.781384 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:25 crc kubenswrapper[4717]: I0308 05:28:25.781330 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:25 crc kubenswrapper[4717]: E0308 05:28:25.781545 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:25 crc kubenswrapper[4717]: E0308 05:28:25.781990 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:25 crc kubenswrapper[4717]: E0308 05:28:25.782113 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:25 crc kubenswrapper[4717]: E0308 05:28:25.782222 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:26 crc kubenswrapper[4717]: I0308 05:28:26.782718 4717 scope.go:117] "RemoveContainer" containerID="f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623" Mar 08 05:28:26 crc kubenswrapper[4717]: E0308 05:28:26.782981 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:28:27 crc kubenswrapper[4717]: I0308 05:28:27.781239 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:27 crc kubenswrapper[4717]: I0308 05:28:27.781337 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:27 crc kubenswrapper[4717]: I0308 05:28:27.781362 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:27 crc kubenswrapper[4717]: I0308 05:28:27.781280 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:27 crc kubenswrapper[4717]: E0308 05:28:27.781474 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:27 crc kubenswrapper[4717]: E0308 05:28:27.781601 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:27 crc kubenswrapper[4717]: E0308 05:28:27.781769 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:27 crc kubenswrapper[4717]: E0308 05:28:27.781858 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:28 crc kubenswrapper[4717]: E0308 05:28:28.880666 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:28:29 crc kubenswrapper[4717]: I0308 05:28:29.781180 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:29 crc kubenswrapper[4717]: I0308 05:28:29.781223 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:29 crc kubenswrapper[4717]: E0308 05:28:29.781377 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:29 crc kubenswrapper[4717]: I0308 05:28:29.781438 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:29 crc kubenswrapper[4717]: E0308 05:28:29.781612 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:29 crc kubenswrapper[4717]: E0308 05:28:29.781721 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:29 crc kubenswrapper[4717]: I0308 05:28:29.781952 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:29 crc kubenswrapper[4717]: E0308 05:28:29.782214 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:31 crc kubenswrapper[4717]: I0308 05:28:31.781215 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:31 crc kubenswrapper[4717]: I0308 05:28:31.781317 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:31 crc kubenswrapper[4717]: I0308 05:28:31.781360 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:31 crc kubenswrapper[4717]: I0308 05:28:31.781279 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:31 crc kubenswrapper[4717]: E0308 05:28:31.782218 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:31 crc kubenswrapper[4717]: E0308 05:28:31.782750 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:31 crc kubenswrapper[4717]: E0308 05:28:31.782921 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:31 crc kubenswrapper[4717]: E0308 05:28:31.782657 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.781572 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.781675 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:33 crc kubenswrapper[4717]: E0308 05:28:33.781806 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.781926 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:33 crc kubenswrapper[4717]: E0308 05:28:33.782146 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.782177 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:33 crc kubenswrapper[4717]: E0308 05:28:33.782404 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:33 crc kubenswrapper[4717]: E0308 05:28:33.782607 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.798442 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.819887 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee184
7b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853
d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:33Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.844526 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae30
2cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:33Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.866030 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:33Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:33 crc kubenswrapper[4717]: E0308 05:28:33.881839 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.891222 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:33Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.911858 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:33Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.932509 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:33Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.954903 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:33Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.969032 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:33Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.982133 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:33Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:33 crc kubenswrapper[4717]: I0308 05:28:33.999569 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:33Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:34 crc kubenswrapper[4717]: I0308 05:28:34.020230 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:34Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:34 crc kubenswrapper[4717]: I0308 05:28:34.033968 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:34Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:34 crc 
kubenswrapper[4717]: I0308 05:28:34.063816 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:13Z\\\",\\\"message\\\":\\\"andler 7 for removal\\\\nI0308 05:28:13.692833 6932 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.692837 6932 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:13.692842 6932 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:13.692853 6932 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:13.692859 6932 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:13.692956 6932 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693014 6932 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693263 6932 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693502 6932 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0308 05:28:13.693700 6932 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:13.693731 6932 factory.go:656] Stopping watch factory\\\\nI0308 05:28:13.693745 6932 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:34Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:34 crc kubenswrapper[4717]: I0308 05:28:34.084231 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:34Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:34 crc kubenswrapper[4717]: I0308 05:28:34.099554 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:34Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:34 crc kubenswrapper[4717]: I0308 05:28:34.117135 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:34Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:34 crc kubenswrapper[4717]: I0308 05:28:34.133774 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:28:34Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.608115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.608167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.608180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.608198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.608210 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:35Z","lastTransitionTime":"2026-03-08T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:35 crc kubenswrapper[4717]: E0308 05:28:35.631170 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:35Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.637674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.637733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.637745 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.637760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.637772 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:35Z","lastTransitionTime":"2026-03-08T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:35 crc kubenswrapper[4717]: E0308 05:28:35.656259 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:35Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.661830 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.661890 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.661913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.661942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.661962 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:35Z","lastTransitionTime":"2026-03-08T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:35 crc kubenswrapper[4717]: E0308 05:28:35.685657 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:35Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.691341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.691374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.691384 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.691401 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.691413 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:35Z","lastTransitionTime":"2026-03-08T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:35 crc kubenswrapper[4717]: E0308 05:28:35.713671 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:35Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.720383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.720509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.720541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.720565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.720585 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:35Z","lastTransitionTime":"2026-03-08T05:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:35 crc kubenswrapper[4717]: E0308 05:28:35.744211 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:35Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:35 crc kubenswrapper[4717]: E0308 05:28:35.744378 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.781418 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.781485 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.781532 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:35 crc kubenswrapper[4717]: E0308 05:28:35.781598 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:35 crc kubenswrapper[4717]: I0308 05:28:35.781740 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:35 crc kubenswrapper[4717]: E0308 05:28:35.781794 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:35 crc kubenswrapper[4717]: E0308 05:28:35.781978 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:35 crc kubenswrapper[4717]: E0308 05:28:35.782245 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.542507 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6f7j_95c5996b-1216-4f9c-bc1f-0ca06f8de088/kube-multus/0.log" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.542587 4717 generic.go:334] "Generic (PLEG): container finished" podID="95c5996b-1216-4f9c-bc1f-0ca06f8de088" containerID="ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8" exitCode=1 Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.542660 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6f7j" event={"ID":"95c5996b-1216-4f9c-bc1f-0ca06f8de088","Type":"ContainerDied","Data":"ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8"} Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.543595 4717 scope.go:117] "RemoveContainer" containerID="ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.568650 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.588450 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b7eff32-3792-4924-a208-2581205a5f90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34622c54a101e225ee4c75628cc21f15006f15d3e5ffba8e722b9ccf452cec28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf774318dfd4a0371268eecf9b6a694c48f986d9fc048469b4a42bcbeb22abde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdc44b696ebe4b8ff1696d157b907f65ecfa7eb765cb7ed08bb17e5aa92d6a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.606039 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.623402 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.649137 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.671624 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.688840 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc 
kubenswrapper[4717]: I0308 05:28:36.720826 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:13Z\\\",\\\"message\\\":\\\"andler 7 for removal\\\\nI0308 05:28:13.692833 6932 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.692837 6932 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:13.692842 6932 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:13.692853 6932 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:13.692859 6932 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:13.692956 6932 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693014 6932 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693263 6932 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693502 6932 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0308 05:28:13.693700 6932 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:13.693731 6932 factory.go:656] Stopping watch factory\\\\nI0308 05:28:13.693745 6932 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.742011 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.762556 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.783627 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.801604 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.839651 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.870314 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae30
2cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.893557 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.910152 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.926218 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:36 crc kubenswrapper[4717]: I0308 05:28:36.941583 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"2026-03-08T05:27:51+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d\\\\n2026-03-08T05:27:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d to /host/opt/cni/bin/\\\\n2026-03-08T05:27:51Z [verbose] multus-daemon started\\\\n2026-03-08T05:27:51Z [verbose] Readiness Indicator file check\\\\n2026-03-08T05:28:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:36Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.551264 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6f7j_95c5996b-1216-4f9c-bc1f-0ca06f8de088/kube-multus/0.log" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.551360 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6f7j" event={"ID":"95c5996b-1216-4f9c-bc1f-0ca06f8de088","Type":"ContainerStarted","Data":"251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6"} Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.564074 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.564339 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:41.564281583 +0000 UTC m=+208.481930547 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.564518 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.564607 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.564731 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.564900 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:29:41.564871968 +0000 UTC m=+208.482520842 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.564914 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.565048 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 05:29:41.565016752 +0000 UTC m=+208.482665636 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.576524 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.601265 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 
05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.620089 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc 
kubenswrapper[4717]: I0308 05:28:37.652915 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:13Z\\\",\\\"message\\\":\\\"andler 7 for removal\\\\nI0308 05:28:13.692833 6932 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.692837 6932 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:13.692842 6932 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:13.692853 6932 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:13.692859 6932 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:13.692956 6932 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693014 6932 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693263 6932 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693502 6932 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0308 05:28:13.693700 6932 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:13.693731 6932 factory.go:656] Stopping watch factory\\\\nI0308 05:28:13.693745 6932 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.674820 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.695125 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.719002 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.734138 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.749991 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.769013 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.769207 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.769250 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.769389 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs podName:dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7 nodeName:}" failed. No retries permitted until 2026-03-08 05:29:41.769352609 +0000 UTC m=+208.687001493 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs") pod "network-metrics-daemon-d64q9" (UID: "dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.769496 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.769549 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.769580 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.769734 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 05:29:41.769669147 +0000 UTC m=+208.687318141 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.769891 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.770128 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.770176 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.770201 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.770283 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-08 05:29:41.770259253 +0000 UTC m=+208.687908257 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.771186 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.780736 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.780918 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.781217 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.781322 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.781561 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.781660 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.781965 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:37 crc kubenswrapper[4717]: E0308 05:28:37.782111 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.794568 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"2026-03-08T05:27:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d\\\\n2026-03-08T05:27:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d to /host/opt/cni/bin/\\\\n2026-03-08T05:27:51Z [verbose] multus-daemon started\\\\n2026-03-08T05:27:51Z [verbose] Readiness Indicator file check\\\\n2026-03-08T05:28:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.832078 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.857304 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.879138 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.904404 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.925816 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b7eff32-3792-4924-a208-2581205a5f90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34622c54a101e225ee4c75628cc21f15006f15d3e5ffba8e722b9ccf452cec28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf774318dfd4a0371268eecf9b6a694c48f986d9fc048469b4a42bcbeb22abde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdc44b696ebe4b8ff1696d157b907f65ecfa7eb765cb7ed08bb17e5aa92d6a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.945659 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:37 crc kubenswrapper[4717]: I0308 05:28:37.963881 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:37Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:38 crc kubenswrapper[4717]: E0308 05:28:38.883331 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:28:39 crc kubenswrapper[4717]: I0308 05:28:39.784933 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:39 crc kubenswrapper[4717]: I0308 05:28:39.785118 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:39 crc kubenswrapper[4717]: E0308 05:28:39.785191 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:39 crc kubenswrapper[4717]: I0308 05:28:39.785250 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:39 crc kubenswrapper[4717]: I0308 05:28:39.785262 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:39 crc kubenswrapper[4717]: E0308 05:28:39.785476 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:39 crc kubenswrapper[4717]: E0308 05:28:39.785553 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:39 crc kubenswrapper[4717]: E0308 05:28:39.785659 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:39 crc kubenswrapper[4717]: I0308 05:28:39.787085 4717 scope.go:117] "RemoveContainer" containerID="f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.574554 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/2.log" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.578388 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808"} Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.578947 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.596546 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.610121 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.623582 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc 
kubenswrapper[4717]: I0308 05:28:40.651605 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:13Z\\\",\\\"message\\\":\\\"andler 7 for removal\\\\nI0308 05:28:13.692833 6932 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.692837 6932 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:13.692842 6932 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:13.692853 6932 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:13.692859 6932 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:13.692956 6932 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693014 6932 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693263 6932 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693502 6932 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0308 05:28:13.693700 6932 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:13.693731 6932 factory.go:656] Stopping watch factory\\\\nI0308 05:28:13.693745 6932 ovnkube.go:599] Stopped ovnkube\\\\nI0308 
05:28:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.667427 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.680878 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.703904 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.718464 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.739778 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.761711 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae30
2cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.778675 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.796515 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.835637 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.857546 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"2026-03-08T05:27:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d\\\\n2026-03-08T05:27:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d to /host/opt/cni/bin/\\\\n2026-03-08T05:27:51Z [verbose] multus-daemon started\\\\n2026-03-08T05:27:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T05:28:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.877137 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.894480 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b7eff32-3792-4924-a208-2581205a5f90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34622c54a101e225ee4c75628cc21f15006f15d3e5ffba8e722b9ccf452cec28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf774318dfd4a0371268eecf9b6a694c48f986d9fc048469b4a42bcbeb22abde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdc44b696ebe4b8ff1696d157b907f65ecfa7eb765cb7ed08bb17e5aa92d6a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.910057 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:40 crc kubenswrapper[4717]: I0308 05:28:40.923074 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:40Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.585339 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/3.log" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.586280 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/2.log" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.590136 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" exitCode=1 Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.590201 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808"} Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.590304 4717 scope.go:117] "RemoveContainer" containerID="f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.591214 4717 scope.go:117] "RemoveContainer" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:28:41 crc kubenswrapper[4717]: E0308 05:28:41.591462 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.614589 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"2026-03-08T05:27:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d\\\\n2026-03-08T05:27:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d to /host/opt/cni/bin/\\\\n2026-03-08T05:27:51Z [verbose] multus-daemon started\\\\n2026-03-08T05:27:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T05:28:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.647081 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.672130 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae30
2cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.693442 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.710884 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.729031 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.748860 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.766180 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b7eff32-3792-4924-a208-2581205a5f90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34622c54a101e225ee4c75628cc21f15006f15d3e5ffba8e722b9ccf452cec28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf774318dfd4a0371268eecf9b6a694c48f986d9fc048469b4a42bcbeb22abde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdc44b696ebe4b8ff1696d157b907f65ecfa7eb765cb7ed08bb17e5aa92d6a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.781320 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.781396 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:41 crc kubenswrapper[4717]: E0308 05:28:41.781582 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.781615 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.781659 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:41 crc kubenswrapper[4717]: E0308 05:28:41.781782 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.781631 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:41 crc kubenswrapper[4717]: E0308 05:28:41.781924 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:41 crc kubenswrapper[4717]: E0308 05:28:41.782040 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.796351 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.811976 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.832599 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.846643 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc 
kubenswrapper[4717]: I0308 05:28:41.889255 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f31e125d812bffa9cdc165182200167ab488566d3998c5c0d437872543f66623\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:13Z\\\",\\\"message\\\":\\\"andler 7 for removal\\\\nI0308 05:28:13.692833 6932 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.692837 6932 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:13.692842 6932 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:13.692853 6932 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:13.692859 6932 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:13.692956 6932 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693014 6932 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693263 6932 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0308 05:28:13.693502 6932 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0308 05:28:13.693700 6932 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:13.693731 6932 factory.go:656] Stopping watch factory\\\\nI0308 05:28:13.693745 6932 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:40Z\\\",\\\"message\\\":\\\"ler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 05:28:40.874493 7177 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:28:40.874491 7177 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0308 05:28:40.874505 7177 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:40.874516 7177 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:40.874517 7177 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:28:40.874538 7177 handler.go:208] Removed *v1.EgressIP 
event handler 8\\\\nI0308 05:28:40.874541 7177 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:40.874569 7177 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 05:28:40.874576 7177 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874629 7177 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:40.874698 7177 factory.go:656] Stopping watch factory\\\\nI0308 05:28:40.874715 7177 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874724 7177 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 05:28:40.874761 7177 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:40.874922 7177 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 05:28:40.875022 7177 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.910399 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.930339 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.951178 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:41 crc kubenswrapper[4717]: I0308 05:28:41.968374 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:28:41Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.597153 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/3.log" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.602709 4717 scope.go:117] "RemoveContainer" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:28:42 crc kubenswrapper[4717]: E0308 05:28:42.602878 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.623933 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.645174 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.661389 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc 
kubenswrapper[4717]: I0308 05:28:42.691571 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:40Z\\\",\\\"message\\\":\\\"ler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 05:28:40.874493 7177 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:28:40.874491 7177 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0308 05:28:40.874505 7177 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:40.874516 
7177 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:40.874517 7177 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:28:40.874538 7177 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:40.874541 7177 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:40.874569 7177 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 05:28:40.874576 7177 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874629 7177 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:40.874698 7177 factory.go:656] Stopping watch factory\\\\nI0308 05:28:40.874715 7177 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874724 7177 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 05:28:40.874761 7177 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:40.874922 7177 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 05:28:40.875022 7177 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.712773 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.730520 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.751165 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.768604 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.803594 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.822963 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae30
2cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.838299 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.855548 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.870451 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.886174 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"2026-03-08T05:27:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d\\\\n2026-03-08T05:27:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d to /host/opt/cni/bin/\\\\n2026-03-08T05:27:51Z [verbose] multus-daemon started\\\\n2026-03-08T05:27:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T05:28:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.902620 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.924114 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b7eff32-3792-4924-a208-2581205a5f90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34622c54a101e225ee4c75628cc21f15006f15d3e5ffba8e722b9ccf452cec28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf774318dfd4a0371268eecf9b6a694c48f986d9fc048469b4a42bcbeb22abde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdc44b696ebe4b8ff1696d157b907f65ecfa7eb765cb7ed08bb17e5aa92d6a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.938871 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:42 crc kubenswrapper[4717]: I0308 05:28:42.957417 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:42Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.781070 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.781180 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.781269 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.781484 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:43 crc kubenswrapper[4717]: E0308 05:28:43.781460 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:43 crc kubenswrapper[4717]: E0308 05:28:43.781671 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:43 crc kubenswrapper[4717]: E0308 05:28:43.781820 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:43 crc kubenswrapper[4717]: E0308 05:28:43.781994 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.807432 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.831928 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"2026-03-08T05:27:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d\\\\n2026-03-08T05:27:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d to /host/opt/cni/bin/\\\\n2026-03-08T05:27:51Z [verbose] multus-daemon started\\\\n2026-03-08T05:27:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T05:28:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.868969 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:43 crc kubenswrapper[4717]: E0308 05:28:43.885377 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.893358 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.913201 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.950490 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.969216 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ 
'[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:43 crc kubenswrapper[4717]: I0308 05:28:43.994394 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b7eff32-3792-4924-a208-2581205a5f90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34622c54a101e225ee4c75628cc21f15006f15d3e5ffba8e722b9ccf452cec28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf774318dfd4a0371268eecf9b6a694c48f986d9fc048469b4a42bcbeb22abde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdc44b696ebe4b8ff1696d157b907f65ecfa7eb765cb7ed08bb17e5aa92d6a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:43Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:44 crc kubenswrapper[4717]: I0308 05:28:44.013368 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3
106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:44 crc kubenswrapper[4717]: I0308 05:28:44.030032 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:44 crc kubenswrapper[4717]: I0308 05:28:44.049418 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:44 crc kubenswrapper[4717]: I0308 05:28:44.068571 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:44 crc kubenswrapper[4717]: I0308 05:28:44.089330 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:44 crc 
kubenswrapper[4717]: I0308 05:28:44.110642 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:40Z\\\",\\\"message\\\":\\\"ler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 05:28:40.874493 7177 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:28:40.874491 7177 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0308 05:28:40.874505 7177 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:40.874516 
7177 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:40.874517 7177 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:28:40.874538 7177 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:40.874541 7177 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:40.874569 7177 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 05:28:40.874576 7177 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874629 7177 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:40.874698 7177 factory.go:656] Stopping watch factory\\\\nI0308 05:28:40.874715 7177 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874724 7177 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 05:28:40.874761 7177 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:40.874922 7177 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 05:28:40.875022 7177 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:44 crc kubenswrapper[4717]: I0308 05:28:44.126766 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:44 crc kubenswrapper[4717]: I0308 05:28:44.141828 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:44 crc kubenswrapper[4717]: I0308 05:28:44.166329 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:44 crc kubenswrapper[4717]: I0308 05:28:44.182503 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:28:44Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.780914 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:45 crc kubenswrapper[4717]: E0308 05:28:45.781438 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.781835 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.781907 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:45 crc kubenswrapper[4717]: E0308 05:28:45.781952 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.782012 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:45 crc kubenswrapper[4717]: E0308 05:28:45.782196 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:45 crc kubenswrapper[4717]: E0308 05:28:45.782311 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.796550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.796590 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.796606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.796626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.796640 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:45Z","lastTransitionTime":"2026-03-08T05:28:45Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:28:45 crc kubenswrapper[4717]: E0308 05:28:45.817965 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.821928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.821963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.821975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.821991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.822003 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:45Z","lastTransitionTime":"2026-03-08T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:45 crc kubenswrapper[4717]: E0308 05:28:45.835409 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.839008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.839045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.839057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.839073 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.839084 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:45Z","lastTransitionTime":"2026-03-08T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:45 crc kubenswrapper[4717]: E0308 05:28:45.860006 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.864029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.864061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.864072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.864089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.864100 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:45Z","lastTransitionTime":"2026-03-08T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:45 crc kubenswrapper[4717]: E0308 05:28:45.885610 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.889573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.889606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.889619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.889637 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:45 crc kubenswrapper[4717]: I0308 05:28:45.889649 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:45Z","lastTransitionTime":"2026-03-08T05:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:45 crc kubenswrapper[4717]: E0308 05:28:45.909087 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:45Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:45 crc kubenswrapper[4717]: E0308 05:28:45.909436 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:28:47 crc kubenswrapper[4717]: I0308 05:28:47.781642 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:47 crc kubenswrapper[4717]: I0308 05:28:47.781762 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:47 crc kubenswrapper[4717]: E0308 05:28:47.781876 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:47 crc kubenswrapper[4717]: E0308 05:28:47.782028 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:47 crc kubenswrapper[4717]: I0308 05:28:47.782151 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:47 crc kubenswrapper[4717]: E0308 05:28:47.782252 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:47 crc kubenswrapper[4717]: I0308 05:28:47.782344 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:47 crc kubenswrapper[4717]: E0308 05:28:47.782468 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:48 crc kubenswrapper[4717]: E0308 05:28:48.887864 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:28:49 crc kubenswrapper[4717]: I0308 05:28:49.780933 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:49 crc kubenswrapper[4717]: I0308 05:28:49.781076 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:49 crc kubenswrapper[4717]: I0308 05:28:49.780994 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:49 crc kubenswrapper[4717]: I0308 05:28:49.780983 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:49 crc kubenswrapper[4717]: E0308 05:28:49.781250 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:49 crc kubenswrapper[4717]: E0308 05:28:49.781470 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:49 crc kubenswrapper[4717]: E0308 05:28:49.781575 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:49 crc kubenswrapper[4717]: E0308 05:28:49.781732 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:51 crc kubenswrapper[4717]: I0308 05:28:51.781495 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:51 crc kubenswrapper[4717]: I0308 05:28:51.781562 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:51 crc kubenswrapper[4717]: I0308 05:28:51.781608 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:51 crc kubenswrapper[4717]: I0308 05:28:51.781512 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:51 crc kubenswrapper[4717]: E0308 05:28:51.781755 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:51 crc kubenswrapper[4717]: E0308 05:28:51.781897 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:51 crc kubenswrapper[4717]: E0308 05:28:51.781981 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:51 crc kubenswrapper[4717]: E0308 05:28:51.782018 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.782068 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.782242 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.782284 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.782378 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:53 crc kubenswrapper[4717]: E0308 05:28:53.782365 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:53 crc kubenswrapper[4717]: E0308 05:28:53.782572 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:53 crc kubenswrapper[4717]: E0308 05:28:53.782872 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:53 crc kubenswrapper[4717]: E0308 05:28:53.782994 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.809070 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.831995 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.851229 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:53 crc 
kubenswrapper[4717]: I0308 05:28:53.885216 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:40Z\\\",\\\"message\\\":\\\"ler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 05:28:40.874493 7177 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:28:40.874491 7177 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0308 05:28:40.874505 7177 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:40.874516 
7177 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:40.874517 7177 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:28:40.874538 7177 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:40.874541 7177 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:40.874569 7177 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 05:28:40.874576 7177 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874629 7177 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:40.874698 7177 factory.go:656] Stopping watch factory\\\\nI0308 05:28:40.874715 7177 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874724 7177 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 05:28:40.874761 7177 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:40.874922 7177 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 05:28:40.875022 7177 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:53 crc kubenswrapper[4717]: E0308 05:28:53.888784 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.908217 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.929125 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.957098 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:53 crc kubenswrapper[4717]: I0308 05:28:53.979266 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:28:53Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:54 crc kubenswrapper[4717]: I0308 05:28:54.004490 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:54 crc kubenswrapper[4717]: I0308 05:28:54.026954 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"2026-03-08T05:27:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d\\\\n2026-03-08T05:27:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d to /host/opt/cni/bin/\\\\n2026-03-08T05:27:51Z [verbose] multus-daemon started\\\\n2026-03-08T05:27:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T05:28:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:54 crc kubenswrapper[4717]: I0308 05:28:54.060161 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:54 crc kubenswrapper[4717]: I0308 05:28:54.083792 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae30
2cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:54 crc kubenswrapper[4717]: I0308 05:28:54.104212 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:54 crc kubenswrapper[4717]: I0308 05:28:54.122914 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:54 crc kubenswrapper[4717]: I0308 05:28:54.144363 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ 
'[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:54 crc kubenswrapper[4717]: I0308 05:28:54.163709 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b7eff32-3792-4924-a208-2581205a5f90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34622c54a101e225ee4c75628cc21f15006f15d3e5ffba8e722b9ccf452cec28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf774318dfd4a0371268eecf9b6a694c48f986d9fc048469b4a42bcbeb22abde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdc44b696ebe4b8ff1696d157b907f65ecfa7eb765cb7ed08bb17e5aa92d6a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:54 crc kubenswrapper[4717]: I0308 05:28:54.183331 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3
106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:54 crc kubenswrapper[4717]: I0308 05:28:54.201075 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:54Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:55 crc kubenswrapper[4717]: I0308 05:28:55.780768 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:55 crc kubenswrapper[4717]: I0308 05:28:55.780844 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:55 crc kubenswrapper[4717]: I0308 05:28:55.781006 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:55 crc kubenswrapper[4717]: I0308 05:28:55.781063 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:55 crc kubenswrapper[4717]: E0308 05:28:55.781193 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:55 crc kubenswrapper[4717]: E0308 05:28:55.781322 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:55 crc kubenswrapper[4717]: E0308 05:28:55.781479 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:55 crc kubenswrapper[4717]: E0308 05:28:55.781894 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:55 crc kubenswrapper[4717]: I0308 05:28:55.797765 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.188273 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.188329 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.188346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.188376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.188393 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:56Z","lastTransitionTime":"2026-03-08T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:56 crc kubenswrapper[4717]: E0308 05:28:56.249420 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.258019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.258113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.258145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.258182 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.258210 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:56Z","lastTransitionTime":"2026-03-08T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:56 crc kubenswrapper[4717]: E0308 05:28:56.286298 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.291141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.291218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.291235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.291257 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.291272 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:56Z","lastTransitionTime":"2026-03-08T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.338199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.338264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.338284 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.338314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:28:56 crc kubenswrapper[4717]: I0308 05:28:56.338337 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:28:56Z","lastTransitionTime":"2026-03-08T05:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:28:56 crc kubenswrapper[4717]: E0308 05:28:56.359411 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:28:56Z is after 2025-08-24T17:21:41Z" Mar 08 05:28:56 crc kubenswrapper[4717]: E0308 05:28:56.359644 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:28:57 crc kubenswrapper[4717]: I0308 05:28:57.781386 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:57 crc kubenswrapper[4717]: I0308 05:28:57.781467 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:57 crc kubenswrapper[4717]: E0308 05:28:57.781753 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:57 crc kubenswrapper[4717]: I0308 05:28:57.781842 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:57 crc kubenswrapper[4717]: I0308 05:28:57.781878 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:57 crc kubenswrapper[4717]: E0308 05:28:57.782096 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:57 crc kubenswrapper[4717]: E0308 05:28:57.782334 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:57 crc kubenswrapper[4717]: E0308 05:28:57.783161 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:28:57 crc kubenswrapper[4717]: I0308 05:28:57.783786 4717 scope.go:117] "RemoveContainer" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:28:57 crc kubenswrapper[4717]: E0308 05:28:57.784224 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:28:58 crc kubenswrapper[4717]: E0308 05:28:58.889839 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:28:59 crc kubenswrapper[4717]: I0308 05:28:59.781082 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:28:59 crc kubenswrapper[4717]: I0308 05:28:59.781085 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:28:59 crc kubenswrapper[4717]: E0308 05:28:59.782164 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:28:59 crc kubenswrapper[4717]: I0308 05:28:59.781353 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:28:59 crc kubenswrapper[4717]: I0308 05:28:59.781217 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:28:59 crc kubenswrapper[4717]: E0308 05:28:59.782367 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:28:59 crc kubenswrapper[4717]: E0308 05:28:59.782528 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:28:59 crc kubenswrapper[4717]: E0308 05:28:59.782750 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:01 crc kubenswrapper[4717]: I0308 05:29:01.781175 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:01 crc kubenswrapper[4717]: I0308 05:29:01.781276 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:01 crc kubenswrapper[4717]: I0308 05:29:01.781339 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:01 crc kubenswrapper[4717]: I0308 05:29:01.781361 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:01 crc kubenswrapper[4717]: E0308 05:29:01.781520 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:01 crc kubenswrapper[4717]: E0308 05:29:01.781720 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:01 crc kubenswrapper[4717]: E0308 05:29:01.781864 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:01 crc kubenswrapper[4717]: E0308 05:29:01.781968 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.781642 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.781834 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:03 crc kubenswrapper[4717]: E0308 05:29:03.782144 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.782226 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.782244 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:03 crc kubenswrapper[4717]: E0308 05:29:03.782642 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:03 crc kubenswrapper[4717]: E0308 05:29:03.782821 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:03 crc kubenswrapper[4717]: E0308 05:29:03.782929 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.809443 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 
10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.830908 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b7eff32-3792-4924-a208-2581205a5f90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34622c54a101e225ee4c75628cc21f15006f15d3e5ffba8e722b9ccf452cec28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf774318dfd4a0371268eecf9b6a694c48f986d9fc048469b4a42bcbeb22abde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdc44b696ebe4b8ff1696d157b907f65ecfa7eb765cb7ed08bb17e5aa92d6a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.847125 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3
106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.865390 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.887274 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:03 crc kubenswrapper[4717]: E0308 05:29:03.890630 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.911839 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.930411 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:03 crc 
kubenswrapper[4717]: I0308 05:29:03.957440 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:40Z\\\",\\\"message\\\":\\\"ler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 05:28:40.874493 7177 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:28:40.874491 7177 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0308 05:28:40.874505 7177 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:40.874516 
7177 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:40.874517 7177 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:28:40.874538 7177 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:40.874541 7177 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:40.874569 7177 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 05:28:40.874576 7177 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874629 7177 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:40.874698 7177 factory.go:656] Stopping watch factory\\\\nI0308 05:28:40.874715 7177 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874724 7177 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 05:28:40.874761 7177 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:40.874922 7177 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 05:28:40.875022 7177 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.978576 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:03 crc kubenswrapper[4717]: I0308 05:29:03.997200 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:03Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:04 crc kubenswrapper[4717]: I0308 05:29:04.027086 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:04 crc kubenswrapper[4717]: I0308 05:29:04.047365 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:04 crc kubenswrapper[4717]: I0308 05:29:04.065322 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dba9c32-2553-43d1-acf3-3528d9f98578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2545fdec6c746926828dba3c09dbe50ee9ee6551b533660dc8b1970df5395db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d4fb5ac105dfbbf3b5d0ec727b9f8e51824d333bdd98edb0af5dad91dadace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53d4fb5ac105dfbbf3b5d0ec727b9f8e51824d333bdd98edb0af5dad91dadace\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:04 crc kubenswrapper[4717]: I0308 05:29:04.101339 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:04 crc kubenswrapper[4717]: I0308 05:29:04.124435 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:04 crc kubenswrapper[4717]: I0308 05:29:04.145073 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:04 crc kubenswrapper[4717]: I0308 05:29:04.167107 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:04 crc kubenswrapper[4717]: I0308 05:29:04.187254 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:04 crc kubenswrapper[4717]: I0308 05:29:04.210069 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"2026-03-08T05:27:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d\\\\n2026-03-08T05:27:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d to /host/opt/cni/bin/\\\\n2026-03-08T05:27:51Z [verbose] multus-daemon started\\\\n2026-03-08T05:27:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T05:28:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:04Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:05 crc kubenswrapper[4717]: I0308 05:29:05.781568 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:05 crc kubenswrapper[4717]: I0308 05:29:05.781612 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:05 crc kubenswrapper[4717]: I0308 05:29:05.781650 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:05 crc kubenswrapper[4717]: E0308 05:29:05.781913 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:05 crc kubenswrapper[4717]: I0308 05:29:05.782063 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:05 crc kubenswrapper[4717]: E0308 05:29:05.782198 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:05 crc kubenswrapper[4717]: E0308 05:29:05.782349 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:05 crc kubenswrapper[4717]: E0308 05:29:05.782483 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.543155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.543606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.543789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.543991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.544141 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:06Z","lastTransitionTime":"2026-03-08T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:29:06 crc kubenswrapper[4717]: E0308 05:29:06.568783 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:06Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.575916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.575981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.576001 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.576031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.576053 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:06Z","lastTransitionTime":"2026-03-08T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:29:06 crc kubenswrapper[4717]: E0308 05:29:06.598356 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:06Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.603801 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.603852 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.603872 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.603895 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.603913 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:06Z","lastTransitionTime":"2026-03-08T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:29:06 crc kubenswrapper[4717]: E0308 05:29:06.624915 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:06Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.631111 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.631304 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.631472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.631650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.631870 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:06Z","lastTransitionTime":"2026-03-08T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:29:06 crc kubenswrapper[4717]: E0308 05:29:06.655717 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:06Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.664776 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.664850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.664989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.665053 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:06 crc kubenswrapper[4717]: I0308 05:29:06.665084 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:06Z","lastTransitionTime":"2026-03-08T05:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:29:06 crc kubenswrapper[4717]: E0308 05:29:06.691222 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:06Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:06 crc kubenswrapper[4717]: E0308 05:29:06.691491 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:29:07 crc kubenswrapper[4717]: I0308 05:29:07.780833 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:07 crc kubenswrapper[4717]: I0308 05:29:07.780984 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:07 crc kubenswrapper[4717]: I0308 05:29:07.780880 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:07 crc kubenswrapper[4717]: E0308 05:29:07.781111 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:07 crc kubenswrapper[4717]: I0308 05:29:07.781162 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:07 crc kubenswrapper[4717]: E0308 05:29:07.781370 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:07 crc kubenswrapper[4717]: E0308 05:29:07.781513 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:07 crc kubenswrapper[4717]: E0308 05:29:07.781667 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:08 crc kubenswrapper[4717]: E0308 05:29:08.892535 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:29:09 crc kubenswrapper[4717]: I0308 05:29:09.781617 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:09 crc kubenswrapper[4717]: I0308 05:29:09.781445 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:09 crc kubenswrapper[4717]: I0308 05:29:09.781666 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:09 crc kubenswrapper[4717]: I0308 05:29:09.781793 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:09 crc kubenswrapper[4717]: E0308 05:29:09.781966 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:09 crc kubenswrapper[4717]: E0308 05:29:09.782112 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:09 crc kubenswrapper[4717]: E0308 05:29:09.782865 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:09 crc kubenswrapper[4717]: E0308 05:29:09.782957 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:09 crc kubenswrapper[4717]: I0308 05:29:09.783633 4717 scope.go:117] "RemoveContainer" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:29:09 crc kubenswrapper[4717]: E0308 05:29:09.784161 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" Mar 08 05:29:11 crc kubenswrapper[4717]: I0308 05:29:11.781487 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:11 crc kubenswrapper[4717]: I0308 05:29:11.781668 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:11 crc kubenswrapper[4717]: I0308 05:29:11.781812 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:11 crc kubenswrapper[4717]: I0308 05:29:11.781868 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:11 crc kubenswrapper[4717]: E0308 05:29:11.783267 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:11 crc kubenswrapper[4717]: E0308 05:29:11.783330 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:11 crc kubenswrapper[4717]: E0308 05:29:11.783404 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:11 crc kubenswrapper[4717]: E0308 05:29:11.783467 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.781041 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.781066 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.781145 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.781376 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:13 crc kubenswrapper[4717]: E0308 05:29:13.781790 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:13 crc kubenswrapper[4717]: E0308 05:29:13.782078 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:13 crc kubenswrapper[4717]: E0308 05:29:13.782321 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:13 crc kubenswrapper[4717]: E0308 05:29:13.782643 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.809152 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.828843 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e2328d4b2b9f4ccdd32d24dabe561757010883be6a83f768284fa780db547c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50bb41c998dce6a9219837f8206ea075116fc797
bd3875af9f37d6cc8a9bb92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zwsgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tb7pf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.858027 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c6317f-efb5-4d91-b5df-c56e975f7c1c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd99d56b3b55e06b1937bea0382d3e097fb8883be612fe1fb48ed82647d5d53b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a9f28cefabc8219e0736ea6346c64771549fbbbc4ad015c336eea3b6f8e3467\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b209cb4a119324fb162c39e8f62738a36cbe896c7a3ebbb9623260c94c35ec8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b54649a46c02887cf41194c9854ff63ea638b2175edcae3955ffb75b55daa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ef4
790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83ef4790ef7317ad97b4ea843a0837b16d790383615f354906e70ce10ac9090c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32ad5a4b1daeafbf1584d27267e8be50c2827eba1a42b0a5a926a2b1d7e75284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d191731d5d911874f2a78e2ee5b54267cae4c2d14e57c7e2c3acb0966f686de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddlf2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pkcrh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.882940 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5508bbd-d773-4b40-a641-e538e619bc1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd0a6d614278886098c434ed2d09d856e648c7d4e9cda548a9bf21ff44146128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ae6209cba0969806179c19acb3c82e1aee42f648f5f208363f297b84ed3f97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvft8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q67qb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-08T05:29:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:13 crc kubenswrapper[4717]: E0308 05:29:13.894558 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.902128 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dba9c32-2553-43d1-acf3-3528d9f98578\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2545fdec6c746926828dba3c09dbe50ee9ee6551b533660dc8b1970df5395db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d4fb5ac105dfbbf3b5d0ec727b9f8e51824d333bdd98edb0af5dad91dadace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53d4fb5ac105dfbbf3b5d0ec727b9f8e51824d333bdd98edb0af5dad91dadace\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.935997 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7741636-b353-4cdb-b7e6-6e87e87b0438\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b94ab32a9d6ffdc5457309c86dfe2b93b58c43daaeb2d0d03b50f410af685232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb1bded7f24f0a45dd1a2f214b9daf75094b1b1a5f489b7ec62fea724eb723c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a757f09a2dd8d0db2b56a984e90a9f6a595cdef0036303ff590368654b89c034\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d814846ebaabfe35b4967883e1bdcb806ccd0837dcf2e485584e244db32a76c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b64776a47e041e0c41a2f1f02dc5f6f67568cfd5bf4955917ff031ea6c55a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c58728e27619291abf7d9f5a38052b64e90a18fda16c01db2bd24c84b8be514\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f87d17ce366d96a667ca6eec6478c7a55414f5c1f5ade07d7155ebf7bc070764\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02b6853d8090bd47fb33c9c2cda5fbb55f2ea69838d318ff62d6500bb21e36a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.961237 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc8bca0b-3590-4748-8dc9-d659f09631bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:27:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 05:27:16.495290 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 05:27:16.495501 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 05:27:16.496655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1533552855/tls.crt::/tmp/serving-cert-1533552855/tls.key\\\\\\\"\\\\nI0308 05:27:17.103734 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 05:27:17.105836 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 05:27:17.105854 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 05:27:17.105885 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 05:27:17.105890 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 05:27:17.109800 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 05:27:17.109839 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109846 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 05:27:17.109852 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 05:27:17.109856 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 05:27:17.109860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 05:27:17.109864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 05:27:17.110242 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 05:27:17.112366 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.981105 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:13 crc kubenswrapper[4717]: I0308 05:29:13.999289 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://330356c76f8ca21605abfde6df85022001b611ffae7a459a93c190b9e47b3318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T05:29:13Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:14 crc kubenswrapper[4717]: I0308 05:29:14.018793 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:14 crc kubenswrapper[4717]: I0308 05:29:14.040459 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6f7j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95c5996b-1216-4f9c-bc1f-0ca06f8de088\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:36Z\\\",\\\"message\\\":\\\"2026-03-08T05:27:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d\\\\n2026-03-08T05:27:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6e6b1f14-7dc3-4626-b7dc-8af82afe1c1d to /host/opt/cni/bin/\\\\n2026-03-08T05:27:51Z [verbose] multus-daemon started\\\\n2026-03-08T05:27:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T05:28:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgxsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6f7j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:14 crc kubenswrapper[4717]: I0308 05:29:14.060044 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3251be28-b6be-4fa0-9049-45adf172e9ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbda1669c30ca455ff4ed3930be610449c87d0d9f1307b10d4a7d504e012fbc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb05b33a09bae827e3b31a85f22c5ed72724dc00408ee137e20362ffbdc15b03\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T05:26:40Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0308 05:26:15.843562 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0308 05:26:15.847084 1 observer_polling.go:159] Starting file observer\\\\nI0308 05:26:15.888176 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0308 05:26:15.892894 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0308 05:26:40.129461 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0308 05:26:40.129719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://965e2bb740e7650731ec70e4fe62e2ad4798fc548ae492588ebdafa9afdda811\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4638c83f8d4fb756f1612a5c83168979dab480cf4c75c97d93a2544fe9820711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:14 crc kubenswrapper[4717]: I0308 05:29:14.078878 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b7eff32-3792-4924-a208-2581205a5f90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:26:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34622c54a101e225ee4c75628cc21f15006f15d3e5ffba8e722b9ccf452cec28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf774318dfd4a0371268eecf9b6a694c48f986d9fc048469b4a42bcbeb22abde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdc44b696ebe4b8ff1696d157b907f65ecfa7eb765cb7ed08bb17e5aa92d6a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:26:16Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43dc2eee110f4c747f8fdefac7de5e12155843cc866728497630ea5f8005722f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:26:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:26:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:26:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:14 crc kubenswrapper[4717]: I0308 05:29:14.094786 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qhwzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a6f4d53-3a88-4caa-b66c-3254cd82186b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8af286e3106d15de1213c1cab725c4e8bdda84e00f77330fe4bd1edb143fccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzbcs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qhwzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:14 crc kubenswrapper[4717]: I0308 05:29:14.111583 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6j4jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e996d1c-6f08-4f2d-a64b-e6f58300117d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2130447e78af3e354ad2eed77798c88445d2072b9f0ac494b2d0f23c16615b9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r82qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6j4jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:14 crc kubenswrapper[4717]: I0308 05:29:14.133452 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://392599e4eb0a1955a088220c8b29fa86b2f94ef6c8419d467ef78ea75a2c0560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:14 crc kubenswrapper[4717]: I0308 05:29:14.153941 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c30234d6c840f1da2a5fb9ceda5902599a64322c09d2aca9251c471c07b3950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d3af9f19d4f79b0092a961db6b55e0eeaefac78309eafa000a2e05f768a205e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:14 crc kubenswrapper[4717]: I0308 05:29:14.171227 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-d64q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrklm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-d64q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:14 crc 
kubenswrapper[4717]: I0308 05:29:14.205258 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b862036c-9fe5-43c3-87a4-9ff24595c456\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T05:27:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T05:28:40Z\\\",\\\"message\\\":\\\"ler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0308 05:28:40.874493 7177 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 05:28:40.874491 7177 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0308 05:28:40.874505 7177 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 05:28:40.874516 
7177 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 05:28:40.874517 7177 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 05:28:40.874538 7177 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 05:28:40.874541 7177 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 05:28:40.874569 7177 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 05:28:40.874576 7177 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874629 7177 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0308 05:28:40.874698 7177 factory.go:656] Stopping watch factory\\\\nI0308 05:28:40.874715 7177 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 05:28:40.874724 7177 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0308 05:28:40.874761 7177 ovnkube.go:599] Stopped ovnkube\\\\nI0308 05:28:40.874922 7177 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0308 05:28:40.875022 7177 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T05:28:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fb27m_openshift-ovn-kubernetes(b862036c-9fe5-43c3-87a4-9ff24595c456)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T05:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705f3fdbb9c4dce7b0
3fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T05:27:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T05:27:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6btr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T05:27:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fb27m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:14Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:15 crc kubenswrapper[4717]: I0308 05:29:15.781043 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:15 crc kubenswrapper[4717]: I0308 05:29:15.782121 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:15 crc kubenswrapper[4717]: I0308 05:29:15.782123 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:15 crc kubenswrapper[4717]: I0308 05:29:15.782121 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:15 crc kubenswrapper[4717]: E0308 05:29:15.782308 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:15 crc kubenswrapper[4717]: E0308 05:29:15.782465 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:15 crc kubenswrapper[4717]: E0308 05:29:15.782624 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:15 crc kubenswrapper[4717]: E0308 05:29:15.782846 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.789063 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.789154 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.789171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.789196 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.789216 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:16Z","lastTransitionTime":"2026-03-08T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:29:16 crc kubenswrapper[4717]: E0308 05:29:16.814966 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:16Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.820721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.820792 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.820810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.820832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.820847 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:16Z","lastTransitionTime":"2026-03-08T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:29:16 crc kubenswrapper[4717]: E0308 05:29:16.835754 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:16Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.840426 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.840474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.840489 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.840516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.840534 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:16Z","lastTransitionTime":"2026-03-08T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:29:16 crc kubenswrapper[4717]: E0308 05:29:16.853380 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:16Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.858448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.858494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.858512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.858536 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.858551 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:16Z","lastTransitionTime":"2026-03-08T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:29:16 crc kubenswrapper[4717]: E0308 05:29:16.877549 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:16Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.882577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.882610 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.882641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.882661 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:16 crc kubenswrapper[4717]: I0308 05:29:16.882672 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:16Z","lastTransitionTime":"2026-03-08T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 05:29:16 crc kubenswrapper[4717]: E0308 05:29:16.901513 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T05:29:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d163e68-b679-4e39-be4b-f06654c828fa\\\",\\\"systemUUID\\\":\\\"19975744-be96-4ad0-8b81-d51bfb4105e2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T05:29:16Z is after 2025-08-24T17:21:41Z" Mar 08 05:29:16 crc kubenswrapper[4717]: E0308 05:29:16.901657 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 05:29:17 crc kubenswrapper[4717]: I0308 05:29:17.781760 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:17 crc kubenswrapper[4717]: I0308 05:29:17.781837 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:17 crc kubenswrapper[4717]: I0308 05:29:17.781844 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:17 crc kubenswrapper[4717]: I0308 05:29:17.781907 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:17 crc kubenswrapper[4717]: E0308 05:29:17.782034 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:17 crc kubenswrapper[4717]: E0308 05:29:17.782263 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:17 crc kubenswrapper[4717]: E0308 05:29:17.782636 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:17 crc kubenswrapper[4717]: E0308 05:29:17.782858 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:18 crc kubenswrapper[4717]: E0308 05:29:18.896318 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:29:19 crc kubenswrapper[4717]: I0308 05:29:19.781462 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:19 crc kubenswrapper[4717]: I0308 05:29:19.781620 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:19 crc kubenswrapper[4717]: E0308 05:29:19.781842 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:19 crc kubenswrapper[4717]: I0308 05:29:19.781886 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:19 crc kubenswrapper[4717]: I0308 05:29:19.781958 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:19 crc kubenswrapper[4717]: E0308 05:29:19.782220 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:19 crc kubenswrapper[4717]: E0308 05:29:19.782292 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:19 crc kubenswrapper[4717]: E0308 05:29:19.782396 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:21 crc kubenswrapper[4717]: I0308 05:29:21.781819 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:21 crc kubenswrapper[4717]: I0308 05:29:21.781955 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:21 crc kubenswrapper[4717]: E0308 05:29:21.782033 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:21 crc kubenswrapper[4717]: I0308 05:29:21.782066 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:21 crc kubenswrapper[4717]: I0308 05:29:21.782072 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:21 crc kubenswrapper[4717]: E0308 05:29:21.782196 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:21 crc kubenswrapper[4717]: E0308 05:29:21.782415 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:21 crc kubenswrapper[4717]: E0308 05:29:21.782743 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:21 crc kubenswrapper[4717]: I0308 05:29:21.784311 4717 scope.go:117] "RemoveContainer" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.805890 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d64q9"] Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.806450 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:22 crc kubenswrapper[4717]: E0308 05:29:22.806602 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.885249 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6f7j_95c5996b-1216-4f9c-bc1f-0ca06f8de088/kube-multus/1.log" Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.886152 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6f7j_95c5996b-1216-4f9c-bc1f-0ca06f8de088/kube-multus/0.log" Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.886209 4717 generic.go:334] "Generic (PLEG): container finished" podID="95c5996b-1216-4f9c-bc1f-0ca06f8de088" containerID="251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6" exitCode=1 Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.886303 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6f7j" event={"ID":"95c5996b-1216-4f9c-bc1f-0ca06f8de088","Type":"ContainerDied","Data":"251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6"} Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.886652 4717 scope.go:117] "RemoveContainer" containerID="ead0546680ec3356b2f5da09ed80e1a89a8030f176173afebf8a7fd3662a62d8" Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.887403 4717 scope.go:117] "RemoveContainer" containerID="251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6" Mar 08 05:29:22 crc kubenswrapper[4717]: E0308 05:29:22.887750 4717 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-d6f7j_openshift-multus(95c5996b-1216-4f9c-bc1f-0ca06f8de088)\"" pod="openshift-multus/multus-d6f7j" podUID="95c5996b-1216-4f9c-bc1f-0ca06f8de088" Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.890531 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/3.log" Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.895761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerStarted","Data":"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed"} Mar 08 05:29:22 crc kubenswrapper[4717]: I0308 05:29:22.896488 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.073927 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pkcrh" podStartSLOduration=130.073890272 podStartE2EDuration="2m10.073890272s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.073315297 +0000 UTC m=+189.990964171" watchObservedRunningTime="2026-03-08 05:29:23.073890272 +0000 UTC m=+189.991539156" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.074516 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podStartSLOduration=130.074505778 podStartE2EDuration="2m10.074505778s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.035373374 +0000 UTC m=+189.953022228" watchObservedRunningTime="2026-03-08 05:29:23.074505778 +0000 UTC m=+189.992154662" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.098877 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q67qb" podStartSLOduration=130.098829061 podStartE2EDuration="2m10.098829061s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.098125223 +0000 UTC m=+190.015774107" watchObservedRunningTime="2026-03-08 05:29:23.098829061 +0000 UTC m=+190.016477905" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.145709 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=28.145652762 podStartE2EDuration="28.145652762s" podCreationTimestamp="2026-03-08 05:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.14520167 +0000 UTC m=+190.062850514" watchObservedRunningTime="2026-03-08 05:29:23.145652762 +0000 UTC m=+190.063301646" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.176425 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=103.17639257 podStartE2EDuration="1m43.17639257s" podCreationTimestamp="2026-03-08 05:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.173961657 +0000 UTC m=+190.091610541" watchObservedRunningTime="2026-03-08 05:29:23.17639257 +0000 UTC m=+190.094041414" Mar 08 05:29:23 crc kubenswrapper[4717]: 
I0308 05:29:23.226485 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=109.226447173 podStartE2EDuration="1m49.226447173s" podCreationTimestamp="2026-03-08 05:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.198619259 +0000 UTC m=+190.116268133" watchObservedRunningTime="2026-03-08 05:29:23.226447173 +0000 UTC m=+190.144096057" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.284768 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.284739697 podStartE2EDuration="1m6.284739697s" podCreationTimestamp="2026-03-08 05:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.284224054 +0000 UTC m=+190.201872908" watchObservedRunningTime="2026-03-08 05:29:23.284739697 +0000 UTC m=+190.202388571" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.304417 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=50.304395721 podStartE2EDuration="50.304395721s" podCreationTimestamp="2026-03-08 05:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.300570863 +0000 UTC m=+190.218219707" watchObservedRunningTime="2026-03-08 05:29:23.304395721 +0000 UTC m=+190.222044595" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.316081 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qhwzg" podStartSLOduration=130.31604417 podStartE2EDuration="2m10.31604417s" podCreationTimestamp="2026-03-08 
05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.315002623 +0000 UTC m=+190.232651507" watchObservedRunningTime="2026-03-08 05:29:23.31604417 +0000 UTC m=+190.233693044" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.336078 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6j4jn" podStartSLOduration=130.336037372 podStartE2EDuration="2m10.336037372s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.335444427 +0000 UTC m=+190.253093281" watchObservedRunningTime="2026-03-08 05:29:23.336037372 +0000 UTC m=+190.253686226" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.382551 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podStartSLOduration=130.382523954 podStartE2EDuration="2m10.382523954s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:23.381755224 +0000 UTC m=+190.299404078" watchObservedRunningTime="2026-03-08 05:29:23.382523954 +0000 UTC m=+190.300172798" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.781087 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:23 crc kubenswrapper[4717]: E0308 05:29:23.783190 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.783801 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.783802 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:23 crc kubenswrapper[4717]: E0308 05:29:23.783947 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:23 crc kubenswrapper[4717]: E0308 05:29:23.784135 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:23 crc kubenswrapper[4717]: E0308 05:29:23.897400 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 05:29:23 crc kubenswrapper[4717]: I0308 05:29:23.901668 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6f7j_95c5996b-1216-4f9c-bc1f-0ca06f8de088/kube-multus/1.log" Mar 08 05:29:24 crc kubenswrapper[4717]: I0308 05:29:24.781839 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:24 crc kubenswrapper[4717]: E0308 05:29:24.782732 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:25 crc kubenswrapper[4717]: I0308 05:29:25.781254 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:25 crc kubenswrapper[4717]: I0308 05:29:25.781363 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:25 crc kubenswrapper[4717]: E0308 05:29:25.781470 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:25 crc kubenswrapper[4717]: I0308 05:29:25.781359 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:25 crc kubenswrapper[4717]: E0308 05:29:25.781579 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:25 crc kubenswrapper[4717]: E0308 05:29:25.781665 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:26 crc kubenswrapper[4717]: I0308 05:29:26.781431 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:26 crc kubenswrapper[4717]: E0308 05:29:26.781725 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.072332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.072408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.072457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.072487 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.072508 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T05:29:27Z","lastTransitionTime":"2026-03-08T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.145660 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm"] Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.146405 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.149586 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.150747 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.150952 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.152332 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.313643 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/88873014-2da1-408a-8c39-545528beb3c5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.313790 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88873014-2da1-408a-8c39-545528beb3c5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.313895 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/88873014-2da1-408a-8c39-545528beb3c5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.313952 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/88873014-2da1-408a-8c39-545528beb3c5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.313988 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88873014-2da1-408a-8c39-545528beb3c5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.416047 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/88873014-2da1-408a-8c39-545528beb3c5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.416128 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88873014-2da1-408a-8c39-545528beb3c5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.416180 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/88873014-2da1-408a-8c39-545528beb3c5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.416216 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88873014-2da1-408a-8c39-545528beb3c5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.416280 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/88873014-2da1-408a-8c39-545528beb3c5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.416324 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88873014-2da1-408a-8c39-545528beb3c5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.416401 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/88873014-2da1-408a-8c39-545528beb3c5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.418050 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88873014-2da1-408a-8c39-545528beb3c5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.429101 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88873014-2da1-408a-8c39-545528beb3c5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.451225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88873014-2da1-408a-8c39-545528beb3c5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t7xsm\" (UID: \"88873014-2da1-408a-8c39-545528beb3c5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.482492 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.781078 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.781153 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.781240 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:27 crc kubenswrapper[4717]: E0308 05:29:27.781355 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:27 crc kubenswrapper[4717]: E0308 05:29:27.781524 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:27 crc kubenswrapper[4717]: E0308 05:29:27.781744 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.832318 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.846132 4717 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.928064 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" event={"ID":"88873014-2da1-408a-8c39-545528beb3c5","Type":"ContainerStarted","Data":"b2d5b6f5bb2660442d5c0101b584349be6e9aacf676a090edc7bec9b7886d818"} Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.928148 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" event={"ID":"88873014-2da1-408a-8c39-545528beb3c5","Type":"ContainerStarted","Data":"91c8f9a0e5f09925d46e8093aa5660172ee114c095c71e9cb91df180ffe7ed9c"} Mar 08 05:29:27 crc kubenswrapper[4717]: I0308 05:29:27.955543 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t7xsm" podStartSLOduration=134.955506264 podStartE2EDuration="2m14.955506264s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:27.953739909 +0000 UTC m=+194.871388783" watchObservedRunningTime="2026-03-08 05:29:27.955506264 +0000 UTC m=+194.873155118" Mar 08 05:29:28 crc kubenswrapper[4717]: I0308 05:29:28.781289 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:28 crc kubenswrapper[4717]: E0308 05:29:28.781540 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:28 crc kubenswrapper[4717]: E0308 05:29:28.899171 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:29:29 crc kubenswrapper[4717]: I0308 05:29:29.781631 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:29 crc kubenswrapper[4717]: I0308 05:29:29.781714 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:29 crc kubenswrapper[4717]: E0308 05:29:29.781914 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:29 crc kubenswrapper[4717]: I0308 05:29:29.781960 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:29 crc kubenswrapper[4717]: E0308 05:29:29.782077 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:29 crc kubenswrapper[4717]: E0308 05:29:29.782334 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:30 crc kubenswrapper[4717]: I0308 05:29:30.780880 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:30 crc kubenswrapper[4717]: E0308 05:29:30.781117 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:31 crc kubenswrapper[4717]: I0308 05:29:31.781091 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:31 crc kubenswrapper[4717]: I0308 05:29:31.781131 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:31 crc kubenswrapper[4717]: I0308 05:29:31.781188 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:31 crc kubenswrapper[4717]: E0308 05:29:31.781316 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:31 crc kubenswrapper[4717]: E0308 05:29:31.781449 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:31 crc kubenswrapper[4717]: E0308 05:29:31.781792 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:32 crc kubenswrapper[4717]: I0308 05:29:32.782130 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:32 crc kubenswrapper[4717]: E0308 05:29:32.783041 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:33 crc kubenswrapper[4717]: I0308 05:29:33.781664 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:33 crc kubenswrapper[4717]: I0308 05:29:33.781835 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:33 crc kubenswrapper[4717]: I0308 05:29:33.781836 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:33 crc kubenswrapper[4717]: E0308 05:29:33.784151 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:33 crc kubenswrapper[4717]: E0308 05:29:33.784294 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:33 crc kubenswrapper[4717]: E0308 05:29:33.784406 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:33 crc kubenswrapper[4717]: E0308 05:29:33.900242 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:29:34 crc kubenswrapper[4717]: I0308 05:29:34.210846 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:29:34 crc kubenswrapper[4717]: I0308 05:29:34.781450 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:34 crc kubenswrapper[4717]: E0308 05:29:34.781653 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:35 crc kubenswrapper[4717]: I0308 05:29:35.781145 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:35 crc kubenswrapper[4717]: I0308 05:29:35.781197 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:35 crc kubenswrapper[4717]: I0308 05:29:35.781146 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:35 crc kubenswrapper[4717]: E0308 05:29:35.781368 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:35 crc kubenswrapper[4717]: E0308 05:29:35.781519 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:35 crc kubenswrapper[4717]: E0308 05:29:35.781617 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:36 crc kubenswrapper[4717]: I0308 05:29:36.781046 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:36 crc kubenswrapper[4717]: E0308 05:29:36.781724 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:36 crc kubenswrapper[4717]: I0308 05:29:36.782269 4717 scope.go:117] "RemoveContainer" containerID="251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6" Mar 08 05:29:36 crc kubenswrapper[4717]: I0308 05:29:36.973178 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6f7j_95c5996b-1216-4f9c-bc1f-0ca06f8de088/kube-multus/1.log" Mar 08 05:29:36 crc kubenswrapper[4717]: I0308 05:29:36.973354 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6f7j" event={"ID":"95c5996b-1216-4f9c-bc1f-0ca06f8de088","Type":"ContainerStarted","Data":"3a18e71ac14cc4af9ad8953aae2e1a8d6cfc3b1666d8ba874932aa48de8222cb"} Mar 08 05:29:37 crc kubenswrapper[4717]: I0308 05:29:37.003268 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d6f7j" podStartSLOduration=144.003229705 podStartE2EDuration="2m24.003229705s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:37.003109992 +0000 UTC m=+203.920758876" watchObservedRunningTime="2026-03-08 05:29:37.003229705 +0000 UTC m=+203.920878589" Mar 08 05:29:37 crc kubenswrapper[4717]: I0308 05:29:37.781568 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:37 crc kubenswrapper[4717]: I0308 05:29:37.781649 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:37 crc kubenswrapper[4717]: I0308 05:29:37.781854 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:37 crc kubenswrapper[4717]: E0308 05:29:37.782028 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 05:29:37 crc kubenswrapper[4717]: E0308 05:29:37.782195 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 05:29:37 crc kubenswrapper[4717]: E0308 05:29:37.782509 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 05:29:38 crc kubenswrapper[4717]: I0308 05:29:38.781210 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:38 crc kubenswrapper[4717]: E0308 05:29:38.781483 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d64q9" podUID="dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7" Mar 08 05:29:39 crc kubenswrapper[4717]: I0308 05:29:39.781573 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:39 crc kubenswrapper[4717]: I0308 05:29:39.781651 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:39 crc kubenswrapper[4717]: I0308 05:29:39.782016 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:39 crc kubenswrapper[4717]: I0308 05:29:39.785349 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 05:29:39 crc kubenswrapper[4717]: I0308 05:29:39.785538 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 05:29:39 crc kubenswrapper[4717]: I0308 05:29:39.785664 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 05:29:39 crc kubenswrapper[4717]: I0308 05:29:39.786193 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 05:29:40 crc kubenswrapper[4717]: I0308 05:29:40.781105 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:40 crc kubenswrapper[4717]: I0308 05:29:40.786380 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 08 05:29:40 crc kubenswrapper[4717]: I0308 05:29:40.786878 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.619390 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:41 crc kubenswrapper[4717]: E0308 05:29:41.619594 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:31:43.619561346 +0000 UTC m=+330.537210200 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.619660 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.619739 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.621414 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.630710 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.822164 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.822249 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.822304 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.828155 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.828277 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7-metrics-certs\") pod \"network-metrics-daemon-d64q9\" (UID: \"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7\") " pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.830045 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.908456 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.923180 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 05:29:41 crc kubenswrapper[4717]: I0308 05:29:41.943014 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:42 crc kubenswrapper[4717]: I0308 05:29:42.008469 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d64q9" Mar 08 05:29:42 crc kubenswrapper[4717]: W0308 05:29:42.275824 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-2a1b1933fb9c94c86487849cb3585c9567f2534f899c79b31b3ee0c9539edfe1 WatchSource:0}: Error finding container 2a1b1933fb9c94c86487849cb3585c9567f2534f899c79b31b3ee0c9539edfe1: Status 404 returned error can't find the container with id 2a1b1933fb9c94c86487849cb3585c9567f2534f899c79b31b3ee0c9539edfe1 Mar 08 05:29:42 crc kubenswrapper[4717]: W0308 05:29:42.276195 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-0f4f0691956ab94756c1a62ee0218d368e7d3341021f1a25860c976bfc70f5f4 WatchSource:0}: Error finding container 0f4f0691956ab94756c1a62ee0218d368e7d3341021f1a25860c976bfc70f5f4: Status 404 returned error can't find the container with id 0f4f0691956ab94756c1a62ee0218d368e7d3341021f1a25860c976bfc70f5f4 Mar 08 05:29:42 crc kubenswrapper[4717]: W0308 05:29:42.297232 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-769c4ec8c948225445c289ad23c61c530dbac3c8e542b48ba29a8c83574f93c0 WatchSource:0}: Error finding container 769c4ec8c948225445c289ad23c61c530dbac3c8e542b48ba29a8c83574f93c0: Status 404 returned error can't find the container with id 769c4ec8c948225445c289ad23c61c530dbac3c8e542b48ba29a8c83574f93c0 Mar 08 05:29:42 crc kubenswrapper[4717]: I0308 05:29:42.315737 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d64q9"] Mar 08 05:29:42 crc kubenswrapper[4717]: W0308 05:29:42.324608 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddee8e0ba_d925_47d5_bad6_0fe9fdc53ef7.slice/crio-e4b9b42f35385ad0c2ef719971f717dff48fda2a4e49ead3ebb71d9b1ed9b56e WatchSource:0}: Error finding container e4b9b42f35385ad0c2ef719971f717dff48fda2a4e49ead3ebb71d9b1ed9b56e: Status 404 returned error can't find the container with id e4b9b42f35385ad0c2ef719971f717dff48fda2a4e49ead3ebb71d9b1ed9b56e Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.007590 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4decedb6359c7a5732f5bf4f8ea3de3f87818a7d8e8a88150e8a01c144ed0c63"} Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.007717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0f4f0691956ab94756c1a62ee0218d368e7d3341021f1a25860c976bfc70f5f4"} Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.010436 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d64q9" event={"ID":"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7","Type":"ContainerStarted","Data":"6cc770a49059217bace0a9c71046289f7ac85c8019288652102b61607d489530"} Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.010551 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d64q9" event={"ID":"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7","Type":"ContainerStarted","Data":"a5bdee464fd4e2e63c761fd777b5af4ffb9dc1ba188804e084eaf9d54b2f702e"} Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.010581 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d64q9" 
event={"ID":"dee8e0ba-d925-47d5-bad6-0fe9fdc53ef7","Type":"ContainerStarted","Data":"e4b9b42f35385ad0c2ef719971f717dff48fda2a4e49ead3ebb71d9b1ed9b56e"} Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.012404 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1259306f07ce075c5523341580b4ea36c95ba1370b67c2514227324d568a07f7"} Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.012462 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2a1b1933fb9c94c86487849cb3585c9567f2534f899c79b31b3ee0c9539edfe1"} Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.014551 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"01143178dc226fe058b3e328b542e44839a54c1226b08630c22a09a8d64a029b"} Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.014613 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"769c4ec8c948225445c289ad23c61c530dbac3c8e542b48ba29a8c83574f93c0"} Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.014872 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:29:43 crc kubenswrapper[4717]: I0308 05:29:43.123149 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-d64q9" podStartSLOduration=150.123117961 podStartE2EDuration="2m30.123117961s" podCreationTimestamp="2026-03-08 
05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:43.122050803 +0000 UTC m=+210.039699677" watchObservedRunningTime="2026-03-08 05:29:43.123117961 +0000 UTC m=+210.040766815" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.627848 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.697330 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cw7dl"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.698447 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.700114 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.702243 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.702277 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.703479 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.705334 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hwxhw"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.706039 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.706629 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.711802 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.712590 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.712851 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.714081 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ppg2t"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.733458 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jvkgl"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.734556 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.735466 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.740182 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.741447 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.742601 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.742762 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.742777 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.743433 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.756308 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.756510 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.756979 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.757185 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.757278 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.757586 4717 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.757822 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.757959 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.758097 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.758367 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.758394 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.758535 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.758620 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.758786 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.758984 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.759274 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 
05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.759898 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.760130 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.760123 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.760422 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.761288 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.761536 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxg8l"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.762210 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.764587 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.764833 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.765080 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.765365 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.765576 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.767088 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.767266 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.767414 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.767504 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.767626 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 08 05:29:47 crc 
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.767909 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.767945 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.768058 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.768156 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.768201 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.768353 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.768438 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.768535 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.768646 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.768777 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.768925 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.769024 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.769115 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.769121 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.768580 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.769584 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.770576 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.770703 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zhsbw"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.771281 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c5mwq"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.771517 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zhsbw"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.771749 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.771996 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.772619 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.773509 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.781958 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.782254 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.782445 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.782661 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.782752 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5tthm"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.783347 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.783569 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.783648 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.783703 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.783966 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.784023 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.784090 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.784180 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.784964 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.785005 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.785134 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.785257 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.785343 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.788773 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.797841 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.798391 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.801360 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmg8f"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.801946 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-hgxsr"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.802231 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.802258 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.802618 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-bdgwk"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.802664 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hgxsr"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.814839 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.815527 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.817761 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-bdgwk"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.819705 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8lcq6"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.820085 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.820657 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.821756 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.830527 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.859717 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.860224 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.860539 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.859732 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.860875 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.861003 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.861036 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.861280 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.861306 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.861556 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.862765 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ppg2t"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.863835 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.864714 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.876745 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.876805 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-smkdd"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.877545 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.889244 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.890296 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.890584 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.891354 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.891662 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp"]
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.892055 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.892261 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.892395 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901703 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a4bc63-d223-4305-abe4-a9a259db716d-serving-cert\") pod \"openshift-config-operator-7777fb866f-kvb2r\" (UID: \"94a4bc63-d223-4305-abe4-a9a259db716d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901739 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-client-ca\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901759 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/11bd956f-1b8e-461d-b42b-50f1b7417607-node-pullsecrets\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901775 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11bd956f-1b8e-461d-b42b-50f1b7417607-serving-cert\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901793 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e73a19-e7c1-4504-8499-4566b10f2682-service-ca-bundle\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901814 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e41d744-e947-4c09-be5d-343f5a6d2bd1-auth-proxy-config\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901829 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/73c6f272-9791-479e-8dde-b761d5da5b75-encryption-config\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901850 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f71b5e8a-6657-41d6-a447-ca755016bed2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901868 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901887 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/73c6f272-9791-479e-8dde-b761d5da5b75-etcd-client\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901902 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6pw\" (UniqueName: \"kubernetes.io/projected/d726a0f6-4858-4e71-8513-75c63f0bfb8d-kube-api-access-mz6pw\") pod \"cluster-samples-operator-665b6dd947-ctr2b\" (UID: \"d726a0f6-4858-4e71-8513-75c63f0bfb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901918 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fd6302fd-1260-4793-9d9b-2dbfba20a013-tmpfs\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901934 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901950 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e41d744-e947-4c09-be5d-343f5a6d2bd1-config\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901965 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsvj8\" (UniqueName: \"kubernetes.io/projected/73c6f272-9791-479e-8dde-b761d5da5b75-kube-api-access-nsvj8\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901981 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7kh5\" (UniqueName: \"kubernetes.io/projected/11bd956f-1b8e-461d-b42b-50f1b7417607-kube-api-access-t7kh5\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.901996 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902017 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902035 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-service-ca\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902051 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nvn4\" (UniqueName: \"kubernetes.io/projected/f8e73a19-e7c1-4504-8499-4566b10f2682-kube-api-access-9nvn4\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902065 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6102a82e-ea92-4f30-9eba-0cf4b19a3d87-srv-cert\") pod \"catalog-operator-68c6474976-d9cd6\" (UID: \"6102a82e-ea92-4f30-9eba-0cf4b19a3d87\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902082 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902098 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0868af-d764-41ec-a4f8-0d7086fbb1cc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-67vch\" (UID: \"bb0868af-d764-41ec-a4f8-0d7086fbb1cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902115 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0868af-d764-41ec-a4f8-0d7086fbb1cc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-67vch\" (UID: \"bb0868af-d764-41ec-a4f8-0d7086fbb1cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902131 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/11bd956f-1b8e-461d-b42b-50f1b7417607-encryption-config\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902148 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11bd956f-1b8e-461d-b42b-50f1b7417607-audit-dir\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902162 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2daede-7003-4bcc-9a92-6342eb319181-serving-cert\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902179 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1e41d744-e947-4c09-be5d-343f5a6d2bd1-machine-approver-tls\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902195 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73c6f272-9791-479e-8dde-b761d5da5b75-serving-cert\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902213 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/11bd956f-1b8e-461d-b42b-50f1b7417607-etcd-client\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902230 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f71b5e8a-6657-41d6-a447-ca755016bed2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902250 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clhsv\" (UniqueName: \"kubernetes.io/projected/94a4bc63-d223-4305-abe4-a9a259db716d-kube-api-access-clhsv\") pod \"openshift-config-operator-7777fb866f-kvb2r\" (UID: \"94a4bc63-d223-4305-abe4-a9a259db716d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902267 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l2cd\" (UniqueName: \"kubernetes.io/projected/13a322b9-ab5c-44e7-bcda-9b05a3ef2f16-kube-api-access-4l2cd\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mhq2\" (UID: \"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902285 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-serving-cert\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902304 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902320 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-oauth-config\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902337 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8e73a19-e7c1-4504-8499-4566b10f2682-default-certificate\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902356 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/94a4bc63-d223-4305-abe4-a9a259db716d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kvb2r\" (UID: \"94a4bc63-d223-4305-abe4-a9a259db716d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902371 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902389 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-trusted-ca-bundle\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902405 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-config\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902420 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73c6f272-9791-479e-8dde-b761d5da5b75-audit-dir\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902436 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-console-config\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902454 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwhvm\" (UniqueName: \"kubernetes.io/projected/8a32992d-6f35-4172-8081-9f64b078e2b3-kube-api-access-lwhvm\") pod \"kube-storage-version-migrator-operator-b67b599dd-cjzqs\" (UID: \"8a32992d-6f35-4172-8081-9f64b078e2b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902470 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kddsw\" (UniqueName: \"kubernetes.io/projected/f71b5e8a-6657-41d6-a447-ca755016bed2-kube-api-access-kddsw\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902487 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/341123d7-044c-40c6-85bc-7a1685c07046-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4ppr\" (UID: \"341123d7-044c-40c6-85bc-7a1685c07046\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902502 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8e73a19-e7c1-4504-8499-4566b10f2682-stats-auth\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902522 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8e73a19-e7c1-4504-8499-4566b10f2682-metrics-certs\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902540 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902555 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-audit\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902571 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-images\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902587 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/341123d7-044c-40c6-85bc-7a1685c07046-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4ppr\" (UID: \"341123d7-044c-40c6-85bc-7a1685c07046\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902604 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6frv6\" (UniqueName: \"kubernetes.io/projected/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-kube-api-access-6frv6\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm"
Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902620 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a32992d-6f35-4172-8081-9f64b078e2b3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cjzqs\"
(UID: \"8a32992d-6f35-4172-8081-9f64b078e2b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902640 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4ca60946-75d5-469e-84f0-d200ca8c0cfd-images\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902654 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-etcd-serving-ca\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902672 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-image-import-ca\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902712 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f7b6ea6-9a3d-432d-a034-956e93323452-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qj6pl\" (UID: \"4f7b6ea6-9a3d-432d-a034-956e93323452\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902729 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzvzc\" (UniqueName: \"kubernetes.io/projected/3b2daede-7003-4bcc-9a92-6342eb319181-kube-api-access-fzvzc\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902747 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902766 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg98z\" (UniqueName: \"kubernetes.io/projected/bb0868af-d764-41ec-a4f8-0d7086fbb1cc-kube-api-access-tg98z\") pod \"openshift-apiserver-operator-796bbdcf4f-67vch\" (UID: \"bb0868af-d764-41ec-a4f8-0d7086fbb1cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902783 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c6f272-9791-479e-8dde-b761d5da5b75-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902799 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-policies\") pod 
\"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2c2b864-898a-4b84-a3ab-168051c21e34-metrics-tls\") pod \"dns-operator-744455d44c-c5mwq\" (UID: \"f2c2b864-898a-4b84-a3ab-168051c21e34\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902834 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/73c6f272-9791-479e-8dde-b761d5da5b75-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902851 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902867 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dr72\" (UniqueName: \"kubernetes.io/projected/cd118c79-042d-48f5-a360-884f4466f65b-kube-api-access-2dr72\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902883 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca60946-75d5-469e-84f0-d200ca8c0cfd-config\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902899 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f7b6ea6-9a3d-432d-a034-956e93323452-config\") pod \"kube-controller-manager-operator-78b949d7b-qj6pl\" (UID: \"4f7b6ea6-9a3d-432d-a034-956e93323452\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902915 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd6302fd-1260-4793-9d9b-2dbfba20a013-apiservice-cert\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902932 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a322b9-ab5c-44e7-bcda-9b05a3ef2f16-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mhq2\" (UID: \"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902949 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-trusted-ca\") pod 
\"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902967 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw2xq\" (UniqueName: \"kubernetes.io/projected/4ca60946-75d5-469e-84f0-d200ca8c0cfd-kube-api-access-lw2xq\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.902983 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-proxy-tls\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903000 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-serving-cert\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903017 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f7b6ea6-9a3d-432d-a034-956e93323452-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qj6pl\" (UID: \"4f7b6ea6-9a3d-432d-a034-956e93323452\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 
05:29:47.903030 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd6302fd-1260-4793-9d9b-2dbfba20a013-webhook-cert\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903046 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-config\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903062 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d726a0f6-4858-4e71-8513-75c63f0bfb8d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ctr2b\" (UID: \"d726a0f6-4858-4e71-8513-75c63f0bfb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903082 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903097 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a32992d-6f35-4172-8081-9f64b078e2b3-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-cjzqs\" (UID: \"8a32992d-6f35-4172-8081-9f64b078e2b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903113 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/92f72a27-7281-40cc-89e0-e0424b81c21d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mxcxh\" (UID: \"92f72a27-7281-40cc-89e0-e0424b81c21d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903130 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-service-ca-bundle\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903147 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73c6f272-9791-479e-8dde-b761d5da5b75-audit-policies\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903164 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-dir\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903180 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903196 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzfr5\" (UniqueName: \"kubernetes.io/projected/f2de96d0-d47e-4240-832d-c9b1e1c882df-kube-api-access-jzfr5\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903213 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkq44\" (UniqueName: \"kubernetes.io/projected/f2c2b864-898a-4b84-a3ab-168051c21e34-kube-api-access-xkq44\") pod \"dns-operator-744455d44c-c5mwq\" (UID: \"f2c2b864-898a-4b84-a3ab-168051c21e34\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903229 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f71b5e8a-6657-41d6-a447-ca755016bed2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903245 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13a322b9-ab5c-44e7-bcda-9b05a3ef2f16-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mhq2\" (UID: \"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903264 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-serving-cert\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903279 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5gv\" (UniqueName: \"kubernetes.io/projected/1e41d744-e947-4c09-be5d-343f5a6d2bd1-kube-api-access-qq5gv\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903295 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ca60946-75d5-469e-84f0-d200ca8c0cfd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903312 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-config\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903328 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krmj5\" (UniqueName: \"kubernetes.io/projected/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-kube-api-access-krmj5\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903345 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-oauth-serving-cert\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903360 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/341123d7-044c-40c6-85bc-7a1685c07046-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4ppr\" (UID: \"341123d7-044c-40c6-85bc-7a1685c07046\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903378 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903393 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8bmn\" (UniqueName: \"kubernetes.io/projected/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-kube-api-access-x8bmn\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903413 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903428 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8gln\" (UniqueName: \"kubernetes.io/projected/fd6302fd-1260-4793-9d9b-2dbfba20a013-kube-api-access-l8gln\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903443 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96j2s\" (UniqueName: \"kubernetes.io/projected/92f72a27-7281-40cc-89e0-e0424b81c21d-kube-api-access-96j2s\") pod \"package-server-manager-789f6589d5-mxcxh\" (UID: \"92f72a27-7281-40cc-89e0-e0424b81c21d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903460 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-config\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.903478 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.904466 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.905067 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hp5l5"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.905457 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.906256 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.924630 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cw7dl"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.939810 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.939885 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549128-gnphx"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.940765 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhjjq"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.941138 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549128-gnphx" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.941242 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.941483 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zhsbw"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.942657 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.944204 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.945612 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.950587 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hwxhw"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.951661 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.952604 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.953591 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.955159 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.958517 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.959389 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.959672 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.962535 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.971280 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 08 05:29:47 crc 
kubenswrapper[4717]: I0308 05:29:47.971621 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.971662 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.972035 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.972106 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.972242 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.972040 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.972314 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.972410 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.972563 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.972655 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.972722 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.972937 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.973141 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.973233 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.973352 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.973460 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.973553 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.973652 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.973771 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.973863 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.975766 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.975875 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.976085 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc"] Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.982024 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.982132 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.982243 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.982725 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.982899 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 05:29:47 crc kubenswrapper[4717]: I0308 05:29:47.983057 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.005847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb782ddc-5c1b-4352-9c44-d8ada04559e0-metrics-tls\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: 
\"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.005893 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb782ddc-5c1b-4352-9c44-d8ada04559e0-trusted-ca\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.005937 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-image-import-ca\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.005974 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f7b6ea6-9a3d-432d-a034-956e93323452-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qj6pl\" (UID: \"4f7b6ea6-9a3d-432d-a034-956e93323452\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006297 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzvzc\" (UniqueName: \"kubernetes.io/projected/3b2daede-7003-4bcc-9a92-6342eb319181-kube-api-access-fzvzc\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006327 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006348 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg98z\" (UniqueName: \"kubernetes.io/projected/bb0868af-d764-41ec-a4f8-0d7086fbb1cc-kube-api-access-tg98z\") pod \"openshift-apiserver-operator-796bbdcf4f-67vch\" (UID: \"bb0868af-d764-41ec-a4f8-0d7086fbb1cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006384 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c6f272-9791-479e-8dde-b761d5da5b75-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006427 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-policies\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006461 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2c2b864-898a-4b84-a3ab-168051c21e34-metrics-tls\") pod \"dns-operator-744455d44c-c5mwq\" (UID: \"f2c2b864-898a-4b84-a3ab-168051c21e34\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" Mar 08 05:29:48 crc 
kubenswrapper[4717]: I0308 05:29:48.006497 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/73c6f272-9791-479e-8dde-b761d5da5b75-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006532 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006556 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dr72\" (UniqueName: \"kubernetes.io/projected/cd118c79-042d-48f5-a360-884f4466f65b-kube-api-access-2dr72\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006579 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca60946-75d5-469e-84f0-d200ca8c0cfd-config\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006604 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f7b6ea6-9a3d-432d-a034-956e93323452-config\") pod \"kube-controller-manager-operator-78b949d7b-qj6pl\" (UID: \"4f7b6ea6-9a3d-432d-a034-956e93323452\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006628 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd6302fd-1260-4793-9d9b-2dbfba20a013-apiservice-cert\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006657 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a322b9-ab5c-44e7-bcda-9b05a3ef2f16-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mhq2\" (UID: \"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006731 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-trusted-ca\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006757 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw2xq\" (UniqueName: \"kubernetes.io/projected/4ca60946-75d5-469e-84f0-d200ca8c0cfd-kube-api-access-lw2xq\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006777 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-proxy-tls\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006806 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b99vz\" (UniqueName: \"kubernetes.io/projected/6102a82e-ea92-4f30-9eba-0cf4b19a3d87-kube-api-access-b99vz\") pod \"catalog-operator-68c6474976-d9cd6\" (UID: \"6102a82e-ea92-4f30-9eba-0cf4b19a3d87\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006925 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-serving-cert\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006948 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f7b6ea6-9a3d-432d-a034-956e93323452-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qj6pl\" (UID: \"4f7b6ea6-9a3d-432d-a034-956e93323452\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.006985 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd6302fd-1260-4793-9d9b-2dbfba20a013-webhook-cert\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:48 crc 
kubenswrapper[4717]: I0308 05:29:48.007010 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-config\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007093 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d726a0f6-4858-4e71-8513-75c63f0bfb8d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ctr2b\" (UID: \"d726a0f6-4858-4e71-8513-75c63f0bfb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007120 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007143 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a32992d-6f35-4172-8081-9f64b078e2b3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cjzqs\" (UID: \"8a32992d-6f35-4172-8081-9f64b078e2b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007165 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/92f72a27-7281-40cc-89e0-e0424b81c21d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mxcxh\" (UID: \"92f72a27-7281-40cc-89e0-e0424b81c21d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007193 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-service-ca-bundle\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007279 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73c6f272-9791-479e-8dde-b761d5da5b75-audit-policies\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007302 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-dir\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007322 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc 
kubenswrapper[4717]: I0308 05:29:48.007346 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzfr5\" (UniqueName: \"kubernetes.io/projected/f2de96d0-d47e-4240-832d-c9b1e1c882df-kube-api-access-jzfr5\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007383 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkq44\" (UniqueName: \"kubernetes.io/projected/f2c2b864-898a-4b84-a3ab-168051c21e34-kube-api-access-xkq44\") pod \"dns-operator-744455d44c-c5mwq\" (UID: \"f2c2b864-898a-4b84-a3ab-168051c21e34\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007411 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f71b5e8a-6657-41d6-a447-ca755016bed2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007437 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb782ddc-5c1b-4352-9c44-d8ada04559e0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007460 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13a322b9-ab5c-44e7-bcda-9b05a3ef2f16-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-8mhq2\" (UID: \"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.007593 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008054 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-serving-cert\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008092 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5gv\" (UniqueName: \"kubernetes.io/projected/1e41d744-e947-4c09-be5d-343f5a6d2bd1-kube-api-access-qq5gv\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008116 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ca60946-75d5-469e-84f0-d200ca8c0cfd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008134 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-config\") pod 
\"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008162 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krmj5\" (UniqueName: \"kubernetes.io/projected/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-kube-api-access-krmj5\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008185 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-oauth-serving-cert\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008208 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/341123d7-044c-40c6-85bc-7a1685c07046-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4ppr\" (UID: \"341123d7-044c-40c6-85bc-7a1685c07046\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008236 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6102a82e-ea92-4f30-9eba-0cf4b19a3d87-profile-collector-cert\") pod \"catalog-operator-68c6474976-d9cd6\" (UID: \"6102a82e-ea92-4f30-9eba-0cf4b19a3d87\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008288 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008316 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8bmn\" (UniqueName: \"kubernetes.io/projected/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-kube-api-access-x8bmn\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008341 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008365 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8gln\" (UniqueName: \"kubernetes.io/projected/fd6302fd-1260-4793-9d9b-2dbfba20a013-kube-api-access-l8gln\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008431 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96j2s\" (UniqueName: \"kubernetes.io/projected/92f72a27-7281-40cc-89e0-e0424b81c21d-kube-api-access-96j2s\") pod \"package-server-manager-789f6589d5-mxcxh\" (UID: 
\"92f72a27-7281-40cc-89e0-e0424b81c21d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008459 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-config\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008482 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008507 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a4bc63-d223-4305-abe4-a9a259db716d-serving-cert\") pod \"openshift-config-operator-7777fb866f-kvb2r\" (UID: \"94a4bc63-d223-4305-abe4-a9a259db716d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008533 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-client-ca\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008551 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/11bd956f-1b8e-461d-b42b-50f1b7417607-node-pullsecrets\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008573 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11bd956f-1b8e-461d-b42b-50f1b7417607-serving-cert\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e73a19-e7c1-4504-8499-4566b10f2682-service-ca-bundle\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008622 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e41d744-e947-4c09-be5d-343f5a6d2bd1-auth-proxy-config\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/73c6f272-9791-479e-8dde-b761d5da5b75-encryption-config\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008773 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f71b5e8a-6657-41d6-a447-ca755016bed2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008802 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008827 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/73c6f272-9791-479e-8dde-b761d5da5b75-etcd-client\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008927 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz6pw\" (UniqueName: \"kubernetes.io/projected/d726a0f6-4858-4e71-8513-75c63f0bfb8d-kube-api-access-mz6pw\") pod \"cluster-samples-operator-665b6dd947-ctr2b\" (UID: \"d726a0f6-4858-4e71-8513-75c63f0bfb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008953 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fd6302fd-1260-4793-9d9b-2dbfba20a013-tmpfs\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 
08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.008979 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009003 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e41d744-e947-4c09-be5d-343f5a6d2bd1-config\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009094 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsvj8\" (UniqueName: \"kubernetes.io/projected/73c6f272-9791-479e-8dde-b761d5da5b75-kube-api-access-nsvj8\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009120 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7kh5\" (UniqueName: \"kubernetes.io/projected/11bd956f-1b8e-461d-b42b-50f1b7417607-kube-api-access-t7kh5\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009148 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: 
\"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009174 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009198 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-service-ca\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009217 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nvn4\" (UniqueName: \"kubernetes.io/projected/f8e73a19-e7c1-4504-8499-4566b10f2682-kube-api-access-9nvn4\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009239 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6102a82e-ea92-4f30-9eba-0cf4b19a3d87-srv-cert\") pod \"catalog-operator-68c6474976-d9cd6\" (UID: \"6102a82e-ea92-4f30-9eba-0cf4b19a3d87\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009263 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009286 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0868af-d764-41ec-a4f8-0d7086fbb1cc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-67vch\" (UID: \"bb0868af-d764-41ec-a4f8-0d7086fbb1cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009306 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0868af-d764-41ec-a4f8-0d7086fbb1cc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-67vch\" (UID: \"bb0868af-d764-41ec-a4f8-0d7086fbb1cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009326 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/11bd956f-1b8e-461d-b42b-50f1b7417607-encryption-config\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009348 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11bd956f-1b8e-461d-b42b-50f1b7417607-audit-dir\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009373 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2daede-7003-4bcc-9a92-6342eb319181-serving-cert\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009462 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1e41d744-e947-4c09-be5d-343f5a6d2bd1-machine-approver-tls\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009479 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73c6f272-9791-479e-8dde-b761d5da5b75-serving-cert\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009501 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/11bd956f-1b8e-461d-b42b-50f1b7417607-etcd-client\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009526 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zg7z\" (UniqueName: \"kubernetes.io/projected/6bb8daf7-8d77-44c2-ab01-b02257d17ac9-kube-api-access-8zg7z\") pod \"downloads-7954f5f757-hgxsr\" (UID: \"6bb8daf7-8d77-44c2-ab01-b02257d17ac9\") " pod="openshift-console/downloads-7954f5f757-hgxsr" Mar 08 05:29:48 
crc kubenswrapper[4717]: I0308 05:29:48.009551 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f71b5e8a-6657-41d6-a447-ca755016bed2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009578 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clhsv\" (UniqueName: \"kubernetes.io/projected/94a4bc63-d223-4305-abe4-a9a259db716d-kube-api-access-clhsv\") pod \"openshift-config-operator-7777fb866f-kvb2r\" (UID: \"94a4bc63-d223-4305-abe4-a9a259db716d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009599 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l2cd\" (UniqueName: \"kubernetes.io/projected/13a322b9-ab5c-44e7-bcda-9b05a3ef2f16-kube-api-access-4l2cd\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mhq2\" (UID: \"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-serving-cert\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009735 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-oauth-config\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009780 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8e73a19-e7c1-4504-8499-4566b10f2682-default-certificate\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009809 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/94a4bc63-d223-4305-abe4-a9a259db716d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kvb2r\" (UID: \"94a4bc63-d223-4305-abe4-a9a259db716d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.010395 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-policies\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.011464 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-image-import-ca\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.009833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj4tw\" (UniqueName: \"kubernetes.io/projected/bb782ddc-5c1b-4352-9c44-d8ada04559e0-kube-api-access-fj4tw\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.011898 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs8lw\" (UniqueName: \"kubernetes.io/projected/915a9413-e8fc-4441-b80a-f2a24186ad76-kube-api-access-cs8lw\") pod \"migrator-59844c95c7-smfpz\" (UID: \"915a9413-e8fc-4441-b80a-f2a24186ad76\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.011929 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.011956 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-trusted-ca-bundle\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " 
pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.011993 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-config\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012020 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73c6f272-9791-479e-8dde-b761d5da5b75-audit-dir\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012042 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-console-config\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012075 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwhvm\" (UniqueName: \"kubernetes.io/projected/8a32992d-6f35-4172-8081-9f64b078e2b3-kube-api-access-lwhvm\") pod \"kube-storage-version-migrator-operator-b67b599dd-cjzqs\" (UID: \"8a32992d-6f35-4172-8081-9f64b078e2b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012097 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kddsw\" (UniqueName: 
\"kubernetes.io/projected/f71b5e8a-6657-41d6-a447-ca755016bed2-kube-api-access-kddsw\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012126 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/341123d7-044c-40c6-85bc-7a1685c07046-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4ppr\" (UID: \"341123d7-044c-40c6-85bc-7a1685c07046\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012147 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8e73a19-e7c1-4504-8499-4566b10f2682-stats-auth\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012162 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8e73a19-e7c1-4504-8499-4566b10f2682-metrics-certs\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012184 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 
05:29:48.012207 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-audit\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012228 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-images\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012249 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/341123d7-044c-40c6-85bc-7a1685c07046-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4ppr\" (UID: \"341123d7-044c-40c6-85bc-7a1685c07046\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012268 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frv6\" (UniqueName: \"kubernetes.io/projected/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-kube-api-access-6frv6\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012289 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a32992d-6f35-4172-8081-9f64b078e2b3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cjzqs\" (UID: \"8a32992d-6f35-4172-8081-9f64b078e2b3\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012313 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4ca60946-75d5-469e-84f0-d200ca8c0cfd-images\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012335 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-etcd-serving-ca\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.012676 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-dir\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.013921 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-config\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.016171 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5tthm"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.016907 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73c6f272-9791-479e-8dde-b761d5da5b75-audit-policies\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.017418 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.018137 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.020475 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/73c6f272-9791-479e-8dde-b761d5da5b75-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.021171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f7b6ea6-9a3d-432d-a034-956e93323452-config\") pod \"kube-controller-manager-operator-78b949d7b-qj6pl\" (UID: \"4f7b6ea6-9a3d-432d-a034-956e93323452\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.021302 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca60946-75d5-469e-84f0-d200ca8c0cfd-config\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.021228 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a322b9-ab5c-44e7-bcda-9b05a3ef2f16-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mhq2\" (UID: \"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.021919 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2c2b864-898a-4b84-a3ab-168051c21e34-metrics-tls\") pod \"dns-operator-744455d44c-c5mwq\" (UID: \"f2c2b864-898a-4b84-a3ab-168051c21e34\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.022151 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-etcd-serving-ca\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.025158 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.023142 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 
05:29:48.028261 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-serving-cert\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.037498 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.037629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f7b6ea6-9a3d-432d-a034-956e93323452-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qj6pl\" (UID: \"4f7b6ea6-9a3d-432d-a034-956e93323452\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.038532 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.040204 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/73c6f272-9791-479e-8dde-b761d5da5b75-etcd-client\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.040296 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxg8l"] Mar 
08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.040339 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lvb9t"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.040445 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-trusted-ca\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.040570 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13a322b9-ab5c-44e7-bcda-9b05a3ef2f16-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mhq2\" (UID: \"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.042720 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.043573 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.045077 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-service-ca-bundle\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.045268 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 08 05:29:48 crc 
kubenswrapper[4717]: I0308 05:29:48.045900 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d726a0f6-4858-4e71-8513-75c63f0bfb8d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ctr2b\" (UID: \"d726a0f6-4858-4e71-8513-75c63f0bfb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.046059 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.046371 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11bd956f-1b8e-461d-b42b-50f1b7417607-audit-dir\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.046886 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.047084 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e73a19-e7c1-4504-8499-4566b10f2682-service-ca-bundle\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.047273 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.047384 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.047416 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-serving-cert\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.047587 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-service-ca\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.048419 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4ca60946-75d5-469e-84f0-d200ca8c0cfd-images\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.048664 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-trusted-ca-bundle\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.050298 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/11bd956f-1b8e-461d-b42b-50f1b7417607-serving-cert\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.050651 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.050876 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/73c6f272-9791-479e-8dde-b761d5da5b75-encryption-config\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.051278 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-config\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.051336 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.051401 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/11bd956f-1b8e-461d-b42b-50f1b7417607-node-pullsecrets\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.051612 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a4bc63-d223-4305-abe4-a9a259db716d-serving-cert\") pod \"openshift-config-operator-7777fb866f-kvb2r\" (UID: \"94a4bc63-d223-4305-abe4-a9a259db716d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.051858 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-config\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.051952 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f71b5e8a-6657-41d6-a447-ca755016bed2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.051961 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.052304 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fd6302fd-1260-4793-9d9b-2dbfba20a013-tmpfs\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.052459 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8e73a19-e7c1-4504-8499-4566b10f2682-metrics-certs\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.053262 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73c6f272-9791-479e-8dde-b761d5da5b75-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.053393 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.053490 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb0868af-d764-41ec-a4f8-0d7086fbb1cc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-67vch\" (UID: \"bb0868af-d764-41ec-a4f8-0d7086fbb1cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.053534 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.053643 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/11bd956f-1b8e-461d-b42b-50f1b7417607-audit\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.053806 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/11bd956f-1b8e-461d-b42b-50f1b7417607-etcd-client\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.052029 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.054216 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-client-ca\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.054368 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.054368 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/94a4bc63-d223-4305-abe4-a9a259db716d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kvb2r\" (UID: 
\"94a4bc63-d223-4305-abe4-a9a259db716d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.054702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73c6f272-9791-479e-8dde-b761d5da5b75-serving-cert\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.054785 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.055578 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.055976 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4ca60946-75d5-469e-84f0-d200ca8c0cfd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.056018 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73c6f272-9791-479e-8dde-b761d5da5b75-audit-dir\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.056125 4717 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.056771 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e41d744-e947-4c09-be5d-343f5a6d2bd1-config\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.057588 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.058840 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/11bd956f-1b8e-461d-b42b-50f1b7417607-encryption-config\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.059175 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.059401 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-serving-cert\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.059609 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.059662 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-oauth-config\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.060476 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.061013 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1e41d744-e947-4c09-be5d-343f5a6d2bd1-machine-approver-tls\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 
05:29:48.062122 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.062318 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2daede-7003-4bcc-9a92-6342eb319181-serving-cert\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.062621 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb0868af-d764-41ec-a4f8-0d7086fbb1cc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-67vch\" (UID: \"bb0868af-d764-41ec-a4f8-0d7086fbb1cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.065541 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.067000 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.067181 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8e73a19-e7c1-4504-8499-4566b10f2682-stats-auth\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 
05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.068435 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.069889 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c5mwq"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.071232 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.071786 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.073626 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.075305 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hgxsr"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.076465 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.077873 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8lcq6"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.079260 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2"] Mar 08 05:29:48 crc 
kubenswrapper[4717]: I0308 05:29:48.080651 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.083051 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.085048 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jvkgl"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.085104 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5d44b"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.085965 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5d44b" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.086137 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmg8f"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.088364 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-szjtn"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.090016 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.090198 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.093583 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e41d744-e947-4c09-be5d-343f5a6d2bd1-auth-proxy-config\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.093616 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-console-config\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.094952 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-config\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.095385 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-oauth-serving-cert\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.096651 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhjjq"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.101173 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-smkdd"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.102647 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.103803 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.105515 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.108178 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549128-gnphx"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.112374 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8e73a19-e7c1-4504-8499-4566b10f2682-default-certificate\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.113171 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb782ddc-5c1b-4352-9c44-d8ada04559e0-metrics-tls\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.113193 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb782ddc-5c1b-4352-9c44-d8ada04559e0-trusted-ca\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.113399 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b99vz\" (UniqueName: \"kubernetes.io/projected/6102a82e-ea92-4f30-9eba-0cf4b19a3d87-kube-api-access-b99vz\") pod \"catalog-operator-68c6474976-d9cd6\" (UID: \"6102a82e-ea92-4f30-9eba-0cf4b19a3d87\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.113646 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb782ddc-5c1b-4352-9c44-d8ada04559e0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.113972 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6102a82e-ea92-4f30-9eba-0cf4b19a3d87-profile-collector-cert\") pod \"catalog-operator-68c6474976-d9cd6\" (UID: \"6102a82e-ea92-4f30-9eba-0cf4b19a3d87\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.114238 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zg7z\" (UniqueName: \"kubernetes.io/projected/6bb8daf7-8d77-44c2-ab01-b02257d17ac9-kube-api-access-8zg7z\") pod \"downloads-7954f5f757-hgxsr\" (UID: \"6bb8daf7-8d77-44c2-ab01-b02257d17ac9\") " pod="openshift-console/downloads-7954f5f757-hgxsr" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.114315 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj4tw\" (UniqueName: 
\"kubernetes.io/projected/bb782ddc-5c1b-4352-9c44-d8ada04559e0-kube-api-access-fj4tw\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.114337 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs8lw\" (UniqueName: \"kubernetes.io/projected/915a9413-e8fc-4441-b80a-f2a24186ad76-kube-api-access-cs8lw\") pod \"migrator-59844c95c7-smfpz\" (UID: \"915a9413-e8fc-4441-b80a-f2a24186ad76\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.120181 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.127950 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5d44b"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.143007 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hp5l5"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.144888 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.147962 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lvb9t"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.151231 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.151265 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s"] Mar 08 05:29:48 crc 
kubenswrapper[4717]: I0308 05:29:48.151279 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.152186 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.152921 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-6jlgw"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.153906 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-szjtn"] Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.154030 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.160207 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.180677 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.185942 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-images\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.200621 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.221125 4717 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.227624 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-proxy-tls\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.241321 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.261337 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.280985 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.301786 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.322050 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.341087 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.361637 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.381557 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 05:29:48 crc 
kubenswrapper[4717]: I0308 05:29:48.401760 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.421021 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.431156 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a32992d-6f35-4172-8081-9f64b078e2b3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cjzqs\" (UID: \"8a32992d-6f35-4172-8081-9f64b078e2b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.444181 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.448771 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a32992d-6f35-4172-8081-9f64b078e2b3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cjzqs\" (UID: \"8a32992d-6f35-4172-8081-9f64b078e2b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.462919 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.481586 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.501300 4717 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.522764 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.542312 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.553952 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6102a82e-ea92-4f30-9eba-0cf4b19a3d87-srv-cert\") pod \"catalog-operator-68c6474976-d9cd6\" (UID: \"6102a82e-ea92-4f30-9eba-0cf4b19a3d87\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.562085 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.569065 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6102a82e-ea92-4f30-9eba-0cf4b19a3d87-profile-collector-cert\") pod \"catalog-operator-68c6474976-d9cd6\" (UID: \"6102a82e-ea92-4f30-9eba-0cf4b19a3d87\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.581980 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.587134 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd6302fd-1260-4793-9d9b-2dbfba20a013-webhook-cert\") pod 
\"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.597167 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd6302fd-1260-4793-9d9b-2dbfba20a013-apiservice-cert\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.602575 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.622216 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.641552 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.661898 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.681455 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.701664 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.723241 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.730632 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb782ddc-5c1b-4352-9c44-d8ada04559e0-metrics-tls\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.742410 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.761357 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.781605 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.812252 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.816224 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb782ddc-5c1b-4352-9c44-d8ada04559e0-trusted-ca\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.821775 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.835970 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/341123d7-044c-40c6-85bc-7a1685c07046-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-c4ppr\" (UID: \"341123d7-044c-40c6-85bc-7a1685c07046\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.842222 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.851650 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/341123d7-044c-40c6-85bc-7a1685c07046-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4ppr\" (UID: \"341123d7-044c-40c6-85bc-7a1685c07046\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.864152 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.870164 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/92f72a27-7281-40cc-89e0-e0424b81c21d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mxcxh\" (UID: \"92f72a27-7281-40cc-89e0-e0424b81c21d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.879754 4717 request.go:700] Waited for 1.001897738s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-admission-controller-secret&limit=500&resourceVersion=0 Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.882801 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.901917 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.922733 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.941788 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.962944 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 08 05:29:48 crc kubenswrapper[4717]: I0308 05:29:48.981111 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.001562 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.021931 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.031527 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f71b5e8a-6657-41d6-a447-ca755016bed2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.042654 4717 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.062885 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.102837 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.123055 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.141208 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.163191 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.182741 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.201102 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.221265 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.242003 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.262437 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 
05:29:49.281644 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.301388 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.321635 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.342277 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.362679 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.393581 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.401560 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.422067 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.441942 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.461657 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.483115 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 
08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.502519 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.565493 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzfr5\" (UniqueName: \"kubernetes.io/projected/f2de96d0-d47e-4240-832d-c9b1e1c882df-kube-api-access-jzfr5\") pod \"oauth-openshift-558db77b4-mxg8l\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.592142 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkq44\" (UniqueName: \"kubernetes.io/projected/f2c2b864-898a-4b84-a3ab-168051c21e34-kube-api-access-xkq44\") pod \"dns-operator-744455d44c-c5mwq\" (UID: \"f2c2b864-898a-4b84-a3ab-168051c21e34\") " pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.609942 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dr72\" (UniqueName: \"kubernetes.io/projected/cd118c79-042d-48f5-a360-884f4466f65b-kube-api-access-2dr72\") pod \"console-f9d7485db-zhsbw\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.633322 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg98z\" (UniqueName: \"kubernetes.io/projected/bb0868af-d764-41ec-a4f8-0d7086fbb1cc-kube-api-access-tg98z\") pod \"openshift-apiserver-operator-796bbdcf4f-67vch\" (UID: \"bb0868af-d764-41ec-a4f8-0d7086fbb1cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.650493 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f7b6ea6-9a3d-432d-a034-956e93323452-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qj6pl\" (UID: \"4f7b6ea6-9a3d-432d-a034-956e93323452\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.662674 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw2xq\" (UniqueName: \"kubernetes.io/projected/4ca60946-75d5-469e-84f0-d200ca8c0cfd-kube-api-access-lw2xq\") pod \"machine-api-operator-5694c8668f-ppg2t\" (UID: \"4ca60946-75d5-469e-84f0-d200ca8c0cfd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.690184 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nvn4\" (UniqueName: \"kubernetes.io/projected/f8e73a19-e7c1-4504-8499-4566b10f2682-kube-api-access-9nvn4\") pod \"router-default-5444994796-bdgwk\" (UID: \"f8e73a19-e7c1-4504-8499-4566b10f2682\") " pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.707477 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/341123d7-044c-40c6-85bc-7a1685c07046-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c4ppr\" (UID: \"341123d7-044c-40c6-85bc-7a1685c07046\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.716111 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.728507 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwhvm\" (UniqueName: \"kubernetes.io/projected/8a32992d-6f35-4172-8081-9f64b078e2b3-kube-api-access-lwhvm\") pod \"kube-storage-version-migrator-operator-b67b599dd-cjzqs\" (UID: \"8a32992d-6f35-4172-8081-9f64b078e2b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.743339 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.748805 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8gln\" (UniqueName: \"kubernetes.io/projected/fd6302fd-1260-4793-9d9b-2dbfba20a013-kube-api-access-l8gln\") pod \"packageserver-d55dfcdfc-4l75s\" (UID: \"fd6302fd-1260-4793-9d9b-2dbfba20a013\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.753814 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.762483 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.772017 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.783737 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.799233 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.803130 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.845100 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frv6\" (UniqueName: \"kubernetes.io/projected/7edd6b0f-e7fd-4133-9d1e-8a7b7356b077-kube-api-access-6frv6\") pod \"authentication-operator-69f744f599-5tthm\" (UID: \"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.848199 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7kh5\" (UniqueName: \"kubernetes.io/projected/11bd956f-1b8e-461d-b42b-50f1b7417607-kube-api-access-t7kh5\") pod \"apiserver-76f77b778f-cw7dl\" (UID: \"11bd956f-1b8e-461d-b42b-50f1b7417607\") " pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.866931 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kddsw\" (UniqueName: \"kubernetes.io/projected/f71b5e8a-6657-41d6-a447-ca755016bed2-kube-api-access-kddsw\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" Mar 08 05:29:49 crc 
kubenswrapper[4717]: I0308 05:29:49.882113 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.890170 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzvzc\" (UniqueName: \"kubernetes.io/projected/3b2daede-7003-4bcc-9a92-6342eb319181-kube-api-access-fzvzc\") pod \"controller-manager-879f6c89f-hwxhw\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.897159 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.897855 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz6pw\" (UniqueName: \"kubernetes.io/projected/d726a0f6-4858-4e71-8513-75c63f0bfb8d-kube-api-access-mz6pw\") pod \"cluster-samples-operator-665b6dd947-ctr2b\" (UID: \"d726a0f6-4858-4e71-8513-75c63f0bfb8d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.899503 4717 request.go:700] Waited for 1.847057821s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.917933 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f71b5e8a-6657-41d6-a447-ca755016bed2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7v6pp\" (UID: \"f71b5e8a-6657-41d6-a447-ca755016bed2\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.936332 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.942608 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l2cd\" (UniqueName: \"kubernetes.io/projected/13a322b9-ab5c-44e7-bcda-9b05a3ef2f16-kube-api-access-4l2cd\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mhq2\" (UID: \"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.948262 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.959274 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96j2s\" (UniqueName: \"kubernetes.io/projected/92f72a27-7281-40cc-89e0-e0424b81c21d-kube-api-access-96j2s\") pod \"package-server-manager-789f6589d5-mxcxh\" (UID: \"92f72a27-7281-40cc-89e0-e0424b81c21d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.973591 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" Mar 08 05:29:49 crc kubenswrapper[4717]: W0308 05:29:49.973648 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e73a19_e7c1_4504_8499_4566b10f2682.slice/crio-8a2f80a272d4ff19215103312cb8e503783c32525c0419366dd7feec5a3d7b3a WatchSource:0}: Error finding container 8a2f80a272d4ff19215103312cb8e503783c32525c0419366dd7feec5a3d7b3a: Status 404 returned error can't find the container with id 8a2f80a272d4ff19215103312cb8e503783c32525c0419366dd7feec5a3d7b3a Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.976381 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq5gv\" (UniqueName: \"kubernetes.io/projected/1e41d744-e947-4c09-be5d-343f5a6d2bd1-kube-api-access-qq5gv\") pod \"machine-approver-56656f9798-hd2k4\" (UID: \"1e41d744-e947-4c09-be5d-343f5a6d2bd1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:49 crc kubenswrapper[4717]: I0308 05:29:49.993125 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.000175 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.002464 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.003198 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krmj5\" (UniqueName: \"kubernetes.io/projected/311276df-2e61-4ff4-bfe0-1e6bf4e327dd-kube-api-access-krmj5\") pod \"machine-config-operator-74547568cd-dfjxc\" (UID: \"311276df-2e61-4ff4-bfe0-1e6bf4e327dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.017899 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.022251 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ppg2t"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.034575 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsvj8\" (UniqueName: \"kubernetes.io/projected/73c6f272-9791-479e-8dde-b761d5da5b75-kube-api-access-nsvj8\") pod \"apiserver-7bbb656c7d-ct6fm\" (UID: \"73c6f272-9791-479e-8dde-b761d5da5b75\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.041520 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.042050 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.049918 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8bmn\" (UniqueName: \"kubernetes.io/projected/5287b01d-2b35-4de8-8a24-f8fa7e778bc0-kube-api-access-x8bmn\") pod \"console-operator-58897d9998-jvkgl\" (UID: \"5287b01d-2b35-4de8-8a24-f8fa7e778bc0\") " pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.053949 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-bdgwk" event={"ID":"f8e73a19-e7c1-4504-8499-4566b10f2682","Type":"ContainerStarted","Data":"8a2f80a272d4ff19215103312cb8e503783c32525c0419366dd7feec5a3d7b3a"} Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.062133 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.067611 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clhsv\" (UniqueName: \"kubernetes.io/projected/94a4bc63-d223-4305-abe4-a9a259db716d-kube-api-access-clhsv\") pod \"openshift-config-operator-7777fb866f-kvb2r\" (UID: \"94a4bc63-d223-4305-abe4-a9a259db716d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.081062 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.105718 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.121411 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.144997 4717 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.161317 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.181763 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.187306 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.222194 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b99vz\" (UniqueName: \"kubernetes.io/projected/6102a82e-ea92-4f30-9eba-0cf4b19a3d87-kube-api-access-b99vz\") pod \"catalog-operator-68c6474976-d9cd6\" (UID: \"6102a82e-ea92-4f30-9eba-0cf4b19a3d87\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.222487 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.254525 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb782ddc-5c1b-4352-9c44-d8ada04559e0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.258471 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.259059 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs8lw\" (UniqueName: \"kubernetes.io/projected/915a9413-e8fc-4441-b80a-f2a24186ad76-kube-api-access-cs8lw\") pod \"migrator-59844c95c7-smfpz\" (UID: \"915a9413-e8fc-4441-b80a-f2a24186ad76\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.263825 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.279325 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cw7dl"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.281029 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.290399 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zg7z\" (UniqueName: \"kubernetes.io/projected/6bb8daf7-8d77-44c2-ab01-b02257d17ac9-kube-api-access-8zg7z\") pod \"downloads-7954f5f757-hgxsr\" (UID: \"6bb8daf7-8d77-44c2-ab01-b02257d17ac9\") " pod="openshift-console/downloads-7954f5f757-hgxsr" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.297475 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.304126 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.312467 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj4tw\" (UniqueName: \"kubernetes.io/projected/bb782ddc-5c1b-4352-9c44-d8ada04559e0-kube-api-access-fj4tw\") pod \"ingress-operator-5b745b69d9-xbmwz\" (UID: \"bb782ddc-5c1b-4352-9c44-d8ada04559e0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.318027 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zhsbw"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.320928 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.321100 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.321412 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.322625 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxg8l"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.355487 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.356002 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.425825 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c5mwq"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.426783 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.462833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-tls\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.463040 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98fp\" (UniqueName: \"kubernetes.io/projected/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-kube-api-access-m98fp\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.463069 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.463108 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-client-ca\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.463252 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-config\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.464142 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/731d619c-5e97-4335-9d85-008faeb45d03-etcd-client\") pod 
\"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.464260 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x79gw\" (UniqueName: \"kubernetes.io/projected/731d619c-5e97-4335-9d85-008faeb45d03-kube-api-access-x79gw\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.466309 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/141bf3de-a931-4d7d-9957-34f4c180819a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-smkdd\" (UID: \"141bf3de-a931-4d7d-9957-34f4c180819a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.466384 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/efe1d476-e883-4331-a1ab-376353731509-srv-cert\") pod \"olm-operator-6b444d44fb-mrdlw\" (UID: \"efe1d476-e883-4331-a1ab-376353731509\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.466459 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d619c-5e97-4335-9d85-008faeb45d03-config\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.466483 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-xm2kr\" (UniqueName: \"kubernetes.io/projected/efe1d476-e883-4331-a1ab-376353731509-kube-api-access-xm2kr\") pod \"olm-operator-6b444d44fb-mrdlw\" (UID: \"efe1d476-e883-4331-a1ab-376353731509\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.466515 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b37d34e-7f32-4fe6-b26d-09e780f37d86-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qrbvv\" (UID: \"0b37d34e-7f32-4fe6-b26d-09e780f37d86\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.466583 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.466615 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b37d34e-7f32-4fe6-b26d-09e780f37d86-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qrbvv\" (UID: \"0b37d34e-7f32-4fe6-b26d-09e780f37d86\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.466695 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-trusted-ca\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: 
\"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.466777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5tlq\" (UniqueName: \"kubernetes.io/projected/10744e37-3152-49e4-ac03-d7eaa6a3439e-kube-api-access-v5tlq\") pod \"machine-config-controller-84d6567774-jddpv\" (UID: \"10744e37-3152-49e4-ac03-d7eaa6a3439e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.466815 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-certificates\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.473480 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5bh6\" (UniqueName: \"kubernetes.io/projected/141bf3de-a931-4d7d-9957-34f4c180819a-kube-api-access-c5bh6\") pod \"multus-admission-controller-857f4d67dd-smkdd\" (UID: \"141bf3de-a931-4d7d-9957-34f4c180819a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.473571 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99x88\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-kube-api-access-99x88\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.473639 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-serving-cert\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.473711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/731d619c-5e97-4335-9d85-008faeb45d03-etcd-service-ca\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.473779 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10744e37-3152-49e4-ac03-d7eaa6a3439e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jddpv\" (UID: \"10744e37-3152-49e4-ac03-d7eaa6a3439e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.473892 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/731d619c-5e97-4335-9d85-008faeb45d03-serving-cert\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.473930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-bound-sa-token\") pod 
\"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.474138 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b37d34e-7f32-4fe6-b26d-09e780f37d86-config\") pod \"kube-apiserver-operator-766d6c64bb-qrbvv\" (UID: \"0b37d34e-7f32-4fe6-b26d-09e780f37d86\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.474177 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/731d619c-5e97-4335-9d85-008faeb45d03-etcd-ca\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.474212 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10744e37-3152-49e4-ac03-d7eaa6a3439e-proxy-tls\") pod \"machine-config-controller-84d6567774-jddpv\" (UID: \"10744e37-3152-49e4-ac03-d7eaa6a3439e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.474342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.474404 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/efe1d476-e883-4331-a1ab-376353731509-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mrdlw\" (UID: \"efe1d476-e883-4331-a1ab-376353731509\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:50 crc kubenswrapper[4717]: E0308 05:29:50.481897 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:50.981875102 +0000 UTC m=+217.899523946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.549136 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-hgxsr" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.557542 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.576514 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:50 crc kubenswrapper[4717]: E0308 05:29:50.576761 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.076719677 +0000 UTC m=+217.994368521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.576824 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3cdfb7b8-6ad7-4243-b963-e96432a2500b-signing-key\") pod \"service-ca-9c57cc56f-hp5l5\" (UID: \"3cdfb7b8-6ad7-4243-b963-e96432a2500b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.576881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/efe1d476-e883-4331-a1ab-376353731509-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mrdlw\" (UID: \"efe1d476-e883-4331-a1ab-376353731509\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.576948 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da83d24b-fb47-44b9-a05e-228eabe397cf-config-volume\") pod \"collect-profiles-29549115-zkqg9\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.576985 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-tls\") pod 
\"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m98fp\" (UniqueName: \"kubernetes.io/projected/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-kube-api-access-m98fp\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577070 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577175 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-client-ca\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577234 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-config\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577268 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-csi-data-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577287 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhjjq\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577305 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3cdfb7b8-6ad7-4243-b963-e96432a2500b-signing-cabundle\") pod \"service-ca-9c57cc56f-hp5l5\" (UID: \"3cdfb7b8-6ad7-4243-b963-e96432a2500b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577340 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/731d619c-5e97-4335-9d85-008faeb45d03-etcd-client\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577361 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0507da4e-a2d5-43c2-b5e2-25f42085431c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hhrcp\" (UID: 
\"0507da4e-a2d5-43c2-b5e2-25f42085431c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577388 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9swwz\" (UniqueName: \"kubernetes.io/projected/efbfbedd-eea1-4275-b62f-d70bdec887a4-kube-api-access-9swwz\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577414 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f38a82-66cb-4dbd-a270-4e2000042f25-serving-cert\") pod \"service-ca-operator-777779d784-5fgj8\" (UID: \"91f38a82-66cb-4dbd-a270-4e2000042f25\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577444 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x79gw\" (UniqueName: \"kubernetes.io/projected/731d619c-5e97-4335-9d85-008faeb45d03-kube-api-access-x79gw\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577463 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/141bf3de-a931-4d7d-9957-34f4c180819a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-smkdd\" (UID: \"141bf3de-a931-4d7d-9957-34f4c180819a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577480 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9f2cba0-2a4b-442e-aa97-aed7413ed0f6-config-volume\") pod \"dns-default-lvb9t\" (UID: \"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6\") " pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577499 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57f7\" (UniqueName: \"kubernetes.io/projected/b9f2cba0-2a4b-442e-aa97-aed7413ed0f6-kube-api-access-p57f7\") pod \"dns-default-lvb9t\" (UID: \"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6\") " pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577518 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/efe1d476-e883-4331-a1ab-376353731509-srv-cert\") pod \"olm-operator-6b444d44fb-mrdlw\" (UID: \"efe1d476-e883-4331-a1ab-376353731509\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577539 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d619c-5e97-4335-9d85-008faeb45d03-config\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577558 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lplfx\" (UniqueName: \"kubernetes.io/projected/3cdfb7b8-6ad7-4243-b963-e96432a2500b-kube-api-access-lplfx\") pod \"service-ca-9c57cc56f-hp5l5\" (UID: \"3cdfb7b8-6ad7-4243-b963-e96432a2500b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xm2kr\" (UniqueName: \"kubernetes.io/projected/efe1d476-e883-4331-a1ab-376353731509-kube-api-access-xm2kr\") pod \"olm-operator-6b444d44fb-mrdlw\" (UID: \"efe1d476-e883-4331-a1ab-376353731509\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577595 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b37d34e-7f32-4fe6-b26d-09e780f37d86-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qrbvv\" (UID: \"0b37d34e-7f32-4fe6-b26d-09e780f37d86\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577614 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9b12aad-95fa-4dc9-87b9-eb31c285f487-cert\") pod \"ingress-canary-5d44b\" (UID: \"b9b12aad-95fa-4dc9-87b9-eb31c285f487\") " pod="openshift-ingress-canary/ingress-canary-5d44b" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da83d24b-fb47-44b9-a05e-228eabe397cf-secret-volume\") pod \"collect-profiles-29549115-zkqg9\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577672 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc 
kubenswrapper[4717]: I0308 05:29:50.577709 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b37d34e-7f32-4fe6-b26d-09e780f37d86-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qrbvv\" (UID: \"0b37d34e-7f32-4fe6-b26d-09e780f37d86\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.577728 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-trusted-ca\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.578296 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5tlq\" (UniqueName: \"kubernetes.io/projected/10744e37-3152-49e4-ac03-d7eaa6a3439e-kube-api-access-v5tlq\") pod \"machine-config-controller-84d6567774-jddpv\" (UID: \"10744e37-3152-49e4-ac03-d7eaa6a3439e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.578344 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-registration-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.578370 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-certificates\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: 
\"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.578402 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmp26\" (UniqueName: \"kubernetes.io/projected/113c7a12-4f47-40ba-be5e-abf62359ffe3-kube-api-access-lmp26\") pod \"machine-config-server-6jlgw\" (UID: \"113c7a12-4f47-40ba-be5e-abf62359ffe3\") " pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.578423 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wp8d\" (UniqueName: \"kubernetes.io/projected/b9b12aad-95fa-4dc9-87b9-eb31c285f487-kube-api-access-8wp8d\") pod \"ingress-canary-5d44b\" (UID: \"b9b12aad-95fa-4dc9-87b9-eb31c285f487\") " pod="openshift-ingress-canary/ingress-canary-5d44b" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.578444 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5bh6\" (UniqueName: \"kubernetes.io/projected/141bf3de-a931-4d7d-9957-34f4c180819a-kube-api-access-c5bh6\") pod \"multus-admission-controller-857f4d67dd-smkdd\" (UID: \"141bf3de-a931-4d7d-9957-34f4c180819a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.578464 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/113c7a12-4f47-40ba-be5e-abf62359ffe3-node-bootstrap-token\") pod \"machine-config-server-6jlgw\" (UID: \"113c7a12-4f47-40ba-be5e-abf62359ffe3\") " pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.578487 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-nrfjr\" (UniqueName: \"kubernetes.io/projected/72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2-kube-api-access-nrfjr\") pod \"auto-csr-approver-29549128-gnphx\" (UID: \"72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2\") " pod="openshift-infra/auto-csr-approver-29549128-gnphx" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.578515 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99x88\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-kube-api-access-99x88\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.579287 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.579746 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731d619c-5e97-4335-9d85-008faeb45d03-config\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.580219 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-serving-cert\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 
05:29:50.580336 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-plugins-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.580413 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/731d619c-5e97-4335-9d85-008faeb45d03-etcd-service-ca\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.580830 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-client-ca\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.581100 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-socket-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.581156 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10744e37-3152-49e4-ac03-d7eaa6a3439e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jddpv\" (UID: \"10744e37-3152-49e4-ac03-d7eaa6a3439e\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.581205 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9f2cba0-2a4b-442e-aa97-aed7413ed0f6-metrics-tls\") pod \"dns-default-lvb9t\" (UID: \"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6\") " pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.581900 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10744e37-3152-49e4-ac03-d7eaa6a3439e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jddpv\" (UID: \"10744e37-3152-49e4-ac03-d7eaa6a3439e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.581967 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/731d619c-5e97-4335-9d85-008faeb45d03-serving-cert\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.582375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-bound-sa-token\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.582486 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhjjq\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.582552 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-config\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.582742 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4qg5\" (UniqueName: \"kubernetes.io/projected/22aa82e5-83a2-4046-8d11-89e9d34e00e1-kube-api-access-q4qg5\") pod \"marketplace-operator-79b997595-zhjjq\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.582839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7x2p\" (UniqueName: \"kubernetes.io/projected/91f38a82-66cb-4dbd-a270-4e2000042f25-kube-api-access-j7x2p\") pod \"service-ca-operator-777779d784-5fgj8\" (UID: \"91f38a82-66cb-4dbd-a270-4e2000042f25\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.582970 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64ttc\" (UniqueName: \"kubernetes.io/projected/0507da4e-a2d5-43c2-b5e2-25f42085431c-kube-api-access-64ttc\") pod \"control-plane-machine-set-operator-78cbb6b69f-hhrcp\" (UID: \"0507da4e-a2d5-43c2-b5e2-25f42085431c\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.582998 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f38a82-66cb-4dbd-a270-4e2000042f25-config\") pod \"service-ca-operator-777779d784-5fgj8\" (UID: \"91f38a82-66cb-4dbd-a270-4e2000042f25\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.583040 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b37d34e-7f32-4fe6-b26d-09e780f37d86-config\") pod \"kube-apiserver-operator-766d6c64bb-qrbvv\" (UID: \"0b37d34e-7f32-4fe6-b26d-09e780f37d86\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.583678 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/731d619c-5e97-4335-9d85-008faeb45d03-etcd-ca\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.583728 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b37d34e-7f32-4fe6-b26d-09e780f37d86-config\") pod \"kube-apiserver-operator-766d6c64bb-qrbvv\" (UID: \"0b37d34e-7f32-4fe6-b26d-09e780f37d86\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.583817 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10744e37-3152-49e4-ac03-d7eaa6a3439e-proxy-tls\") pod 
\"machine-config-controller-84d6567774-jddpv\" (UID: \"10744e37-3152-49e4-ac03-d7eaa6a3439e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.583899 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqftc\" (UniqueName: \"kubernetes.io/projected/da83d24b-fb47-44b9-a05e-228eabe397cf-kube-api-access-gqftc\") pod \"collect-profiles-29549115-zkqg9\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.585192 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-trusted-ca\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.586866 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-mountpoint-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.587128 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.587190 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-serving-cert\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.587362 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/113c7a12-4f47-40ba-be5e-abf62359ffe3-certs\") pod \"machine-config-server-6jlgw\" (UID: \"113c7a12-4f47-40ba-be5e-abf62359ffe3\") " pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:50 crc kubenswrapper[4717]: E0308 05:29:50.588072 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.088050721 +0000 UTC m=+218.005699565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.588142 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/731d619c-5e97-4335-9d85-008faeb45d03-serving-cert\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.588633 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10744e37-3152-49e4-ac03-d7eaa6a3439e-proxy-tls\") pod \"machine-config-controller-84d6567774-jddpv\" (UID: \"10744e37-3152-49e4-ac03-d7eaa6a3439e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.588843 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/731d619c-5e97-4335-9d85-008faeb45d03-etcd-service-ca\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.589554 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5tthm"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.589931 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/efe1d476-e883-4331-a1ab-376353731509-srv-cert\") pod \"olm-operator-6b444d44fb-mrdlw\" (UID: \"efe1d476-e883-4331-a1ab-376353731509\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.589950 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.591087 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/141bf3de-a931-4d7d-9957-34f4c180819a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-smkdd\" (UID: \"141bf3de-a931-4d7d-9957-34f4c180819a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.591550 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/731d619c-5e97-4335-9d85-008faeb45d03-etcd-client\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.591760 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/efe1d476-e883-4331-a1ab-376353731509-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mrdlw\" (UID: \"efe1d476-e883-4331-a1ab-376353731509\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.591870 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/731d619c-5e97-4335-9d85-008faeb45d03-etcd-ca\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.591878 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-certificates\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.592100 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.595234 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b37d34e-7f32-4fe6-b26d-09e780f37d86-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qrbvv\" (UID: \"0b37d34e-7f32-4fe6-b26d-09e780f37d86\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.595818 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-tls\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: W0308 05:29:50.624213 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a32992d_6f35_4172_8081_9f64b078e2b3.slice/crio-14d4bb5e15a900066dc679a61d98c1d613371a576a084c5d6d65ee0fde7d9f48 WatchSource:0}: Error finding container 14d4bb5e15a900066dc679a61d98c1d613371a576a084c5d6d65ee0fde7d9f48: Status 404 returned error can't find the container with id 14d4bb5e15a900066dc679a61d98c1d613371a576a084c5d6d65ee0fde7d9f48 Mar 08 05:29:50 crc kubenswrapper[4717]: W0308 05:29:50.629336 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7edd6b0f_e7fd_4133_9d1e_8a7b7356b077.slice/crio-5704fe6b2aa7e2a2c513809c2d80454eb03a4ae0c512c356ad950b2e35e68938 WatchSource:0}: Error finding container 5704fe6b2aa7e2a2c513809c2d80454eb03a4ae0c512c356ad950b2e35e68938: Status 404 returned error can't find the container with id 5704fe6b2aa7e2a2c513809c2d80454eb03a4ae0c512c356ad950b2e35e68938 Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.631191 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m98fp\" (UniqueName: \"kubernetes.io/projected/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-kube-api-access-m98fp\") pod \"route-controller-manager-6576b87f9c-z2rl6\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.646757 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x79gw\" (UniqueName: \"kubernetes.io/projected/731d619c-5e97-4335-9d85-008faeb45d03-kube-api-access-x79gw\") pod \"etcd-operator-b45778765-8lcq6\" (UID: \"731d619c-5e97-4335-9d85-008faeb45d03\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.665431 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm2kr\" (UniqueName: 
\"kubernetes.io/projected/efe1d476-e883-4331-a1ab-376353731509-kube-api-access-xm2kr\") pod \"olm-operator-6b444d44fb-mrdlw\" (UID: \"efe1d476-e883-4331-a1ab-376353731509\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.675140 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hwxhw"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.695793 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b37d34e-7f32-4fe6-b26d-09e780f37d86-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qrbvv\" (UID: \"0b37d34e-7f32-4fe6-b26d-09e780f37d86\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.695843 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:50 crc kubenswrapper[4717]: E0308 05:29:50.695944 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.195922575 +0000 UTC m=+218.113571419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-csi-data-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696237 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhjjq\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3cdfb7b8-6ad7-4243-b963-e96432a2500b-signing-cabundle\") pod \"service-ca-9c57cc56f-hp5l5\" (UID: \"3cdfb7b8-6ad7-4243-b963-e96432a2500b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696287 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0507da4e-a2d5-43c2-b5e2-25f42085431c-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-hhrcp\" (UID: \"0507da4e-a2d5-43c2-b5e2-25f42085431c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696315 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9swwz\" (UniqueName: \"kubernetes.io/projected/efbfbedd-eea1-4275-b62f-d70bdec887a4-kube-api-access-9swwz\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696333 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f38a82-66cb-4dbd-a270-4e2000042f25-serving-cert\") pod \"service-ca-operator-777779d784-5fgj8\" (UID: \"91f38a82-66cb-4dbd-a270-4e2000042f25\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696357 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9f2cba0-2a4b-442e-aa97-aed7413ed0f6-config-volume\") pod \"dns-default-lvb9t\" (UID: \"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6\") " pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696378 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p57f7\" (UniqueName: \"kubernetes.io/projected/b9f2cba0-2a4b-442e-aa97-aed7413ed0f6-kube-api-access-p57f7\") pod \"dns-default-lvb9t\" (UID: \"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6\") " pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696402 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lplfx\" (UniqueName: 
\"kubernetes.io/projected/3cdfb7b8-6ad7-4243-b963-e96432a2500b-kube-api-access-lplfx\") pod \"service-ca-9c57cc56f-hp5l5\" (UID: \"3cdfb7b8-6ad7-4243-b963-e96432a2500b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696422 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9b12aad-95fa-4dc9-87b9-eb31c285f487-cert\") pod \"ingress-canary-5d44b\" (UID: \"b9b12aad-95fa-4dc9-87b9-eb31c285f487\") " pod="openshift-ingress-canary/ingress-canary-5d44b" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696440 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da83d24b-fb47-44b9-a05e-228eabe397cf-secret-volume\") pod \"collect-profiles-29549115-zkqg9\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-registration-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696507 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmp26\" (UniqueName: \"kubernetes.io/projected/113c7a12-4f47-40ba-be5e-abf62359ffe3-kube-api-access-lmp26\") pod \"machine-config-server-6jlgw\" (UID: \"113c7a12-4f47-40ba-be5e-abf62359ffe3\") " pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696525 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wp8d\" 
(UniqueName: \"kubernetes.io/projected/b9b12aad-95fa-4dc9-87b9-eb31c285f487-kube-api-access-8wp8d\") pod \"ingress-canary-5d44b\" (UID: \"b9b12aad-95fa-4dc9-87b9-eb31c285f487\") " pod="openshift-ingress-canary/ingress-canary-5d44b" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696550 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/113c7a12-4f47-40ba-be5e-abf62359ffe3-node-bootstrap-token\") pod \"machine-config-server-6jlgw\" (UID: \"113c7a12-4f47-40ba-be5e-abf62359ffe3\") " pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.696571 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrfjr\" (UniqueName: \"kubernetes.io/projected/72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2-kube-api-access-nrfjr\") pod \"auto-csr-approver-29549128-gnphx\" (UID: \"72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2\") " pod="openshift-infra/auto-csr-approver-29549128-gnphx" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.697167 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-registration-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.697371 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-csi-data-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.698944 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zhjjq\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.699126 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3cdfb7b8-6ad7-4243-b963-e96432a2500b-signing-cabundle\") pod \"service-ca-9c57cc56f-hp5l5\" (UID: \"3cdfb7b8-6ad7-4243-b963-e96432a2500b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.699133 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-plugins-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.699219 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-socket-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.699243 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9f2cba0-2a4b-442e-aa97-aed7413ed0f6-metrics-tls\") pod \"dns-default-lvb9t\" (UID: \"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6\") " pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.699265 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-plugins-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.699280 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhjjq\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.699336 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4qg5\" (UniqueName: \"kubernetes.io/projected/22aa82e5-83a2-4046-8d11-89e9d34e00e1-kube-api-access-q4qg5\") pod \"marketplace-operator-79b997595-zhjjq\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.699360 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7x2p\" (UniqueName: \"kubernetes.io/projected/91f38a82-66cb-4dbd-a270-4e2000042f25-kube-api-access-j7x2p\") pod \"service-ca-operator-777779d784-5fgj8\" (UID: \"91f38a82-66cb-4dbd-a270-4e2000042f25\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.699410 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64ttc\" (UniqueName: \"kubernetes.io/projected/0507da4e-a2d5-43c2-b5e2-25f42085431c-kube-api-access-64ttc\") pod \"control-plane-machine-set-operator-78cbb6b69f-hhrcp\" (UID: \"0507da4e-a2d5-43c2-b5e2-25f42085431c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" Mar 08 
05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.699439 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f38a82-66cb-4dbd-a270-4e2000042f25-config\") pod \"service-ca-operator-777779d784-5fgj8\" (UID: \"91f38a82-66cb-4dbd-a270-4e2000042f25\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.700156 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqftc\" (UniqueName: \"kubernetes.io/projected/da83d24b-fb47-44b9-a05e-228eabe397cf-kube-api-access-gqftc\") pod \"collect-profiles-29549115-zkqg9\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.700200 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-mountpoint-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.700233 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.700252 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/113c7a12-4f47-40ba-be5e-abf62359ffe3-certs\") pod \"machine-config-server-6jlgw\" (UID: 
\"113c7a12-4f47-40ba-be5e-abf62359ffe3\") " pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.700296 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3cdfb7b8-6ad7-4243-b963-e96432a2500b-signing-key\") pod \"service-ca-9c57cc56f-hp5l5\" (UID: \"3cdfb7b8-6ad7-4243-b963-e96432a2500b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.700319 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da83d24b-fb47-44b9-a05e-228eabe397cf-config-volume\") pod \"collect-profiles-29549115-zkqg9\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.701112 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9b12aad-95fa-4dc9-87b9-eb31c285f487-cert\") pod \"ingress-canary-5d44b\" (UID: \"b9b12aad-95fa-4dc9-87b9-eb31c285f487\") " pod="openshift-ingress-canary/ingress-canary-5d44b" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.701412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da83d24b-fb47-44b9-a05e-228eabe397cf-config-volume\") pod \"collect-profiles-29549115-zkqg9\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.702504 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9f2cba0-2a4b-442e-aa97-aed7413ed0f6-config-volume\") pod \"dns-default-lvb9t\" (UID: 
\"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6\") " pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.702526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f38a82-66cb-4dbd-a270-4e2000042f25-config\") pod \"service-ca-operator-777779d784-5fgj8\" (UID: \"91f38a82-66cb-4dbd-a270-4e2000042f25\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:50 crc kubenswrapper[4717]: E0308 05:29:50.702799 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.202784723 +0000 UTC m=+218.120433567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.702920 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-mountpoint-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.705951 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/113c7a12-4f47-40ba-be5e-abf62359ffe3-certs\") pod \"machine-config-server-6jlgw\" (UID: \"113c7a12-4f47-40ba-be5e-abf62359ffe3\") " 
pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.706992 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99x88\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-kube-api-access-99x88\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.707087 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da83d24b-fb47-44b9-a05e-228eabe397cf-secret-volume\") pod \"collect-profiles-29549115-zkqg9\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.711897 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/efbfbedd-eea1-4275-b62f-d70bdec887a4-socket-dir\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.715261 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/113c7a12-4f47-40ba-be5e-abf62359ffe3-node-bootstrap-token\") pod \"machine-config-server-6jlgw\" (UID: \"113c7a12-4f47-40ba-be5e-abf62359ffe3\") " pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.726896 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9f2cba0-2a4b-442e-aa97-aed7413ed0f6-metrics-tls\") pod \"dns-default-lvb9t\" (UID: \"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6\") " 
pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.727373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zhjjq\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.736775 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0507da4e-a2d5-43c2-b5e2-25f42085431c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hhrcp\" (UID: \"0507da4e-a2d5-43c2-b5e2-25f42085431c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.736837 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f38a82-66cb-4dbd-a270-4e2000042f25-serving-cert\") pod \"service-ca-operator-777779d784-5fgj8\" (UID: \"91f38a82-66cb-4dbd-a270-4e2000042f25\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.740205 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3cdfb7b8-6ad7-4243-b963-e96432a2500b-signing-key\") pod \"service-ca-9c57cc56f-hp5l5\" (UID: \"3cdfb7b8-6ad7-4243-b963-e96432a2500b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.747885 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5bh6\" (UniqueName: 
\"kubernetes.io/projected/141bf3de-a931-4d7d-9957-34f4c180819a-kube-api-access-c5bh6\") pod \"multus-admission-controller-857f4d67dd-smkdd\" (UID: \"141bf3de-a931-4d7d-9957-34f4c180819a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.759556 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5tlq\" (UniqueName: \"kubernetes.io/projected/10744e37-3152-49e4-ac03-d7eaa6a3439e-kube-api-access-v5tlq\") pod \"machine-config-controller-84d6567774-jddpv\" (UID: \"10744e37-3152-49e4-ac03-d7eaa6a3439e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" Mar 08 05:29:50 crc kubenswrapper[4717]: W0308 05:29:50.762153 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b2daede_7003_4bcc_9a92_6342eb319181.slice/crio-68d7f54ba2c53afa8fdee28db793c495a94083e85a76fcd822b25f8697208200 WatchSource:0}: Error finding container 68d7f54ba2c53afa8fdee28db793c495a94083e85a76fcd822b25f8697208200: Status 404 returned error can't find the container with id 68d7f54ba2c53afa8fdee28db793c495a94083e85a76fcd822b25f8697208200 Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.772163 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-bound-sa-token\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.805017 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:50 crc kubenswrapper[4717]: E0308 05:29:50.805647 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.305624376 +0000 UTC m=+218.223273220 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.813196 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.817297 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57f7\" (UniqueName: \"kubernetes.io/projected/b9f2cba0-2a4b-442e-aa97-aed7413ed0f6-kube-api-access-p57f7\") pod \"dns-default-lvb9t\" (UID: \"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6\") " pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.822530 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.826466 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.841629 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8wp8d\" (UniqueName: \"kubernetes.io/projected/b9b12aad-95fa-4dc9-87b9-eb31c285f487-kube-api-access-8wp8d\") pod \"ingress-canary-5d44b\" (UID: \"b9b12aad-95fa-4dc9-87b9-eb31c285f487\") " pod="openshift-ingress-canary/ingress-canary-5d44b" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.842048 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lplfx\" (UniqueName: \"kubernetes.io/projected/3cdfb7b8-6ad7-4243-b963-e96432a2500b-kube-api-access-lplfx\") pod \"service-ca-9c57cc56f-hp5l5\" (UID: \"3cdfb7b8-6ad7-4243-b963-e96432a2500b\") " pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.847033 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.847401 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.852507 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.863856 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmp26\" (UniqueName: \"kubernetes.io/projected/113c7a12-4f47-40ba-be5e-abf62359ffe3-kube-api-access-lmp26\") pod \"machine-config-server-6jlgw\" (UID: \"113c7a12-4f47-40ba-be5e-abf62359ffe3\") " pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.874338 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.886417 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9swwz\" (UniqueName: \"kubernetes.io/projected/efbfbedd-eea1-4275-b62f-d70bdec887a4-kube-api-access-9swwz\") pod \"csi-hostpathplugin-szjtn\" (UID: \"efbfbedd-eea1-4275-b62f-d70bdec887a4\") " pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.906981 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:50 crc kubenswrapper[4717]: E0308 05:29:50.909146 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.409129156 +0000 UTC m=+218.326778000 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.910440 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.920829 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrfjr\" (UniqueName: \"kubernetes.io/projected/72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2-kube-api-access-nrfjr\") pod \"auto-csr-approver-29549128-gnphx\" (UID: \"72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2\") " pod="openshift-infra/auto-csr-approver-29549128-gnphx" Mar 08 05:29:50 crc kubenswrapper[4717]: W0308 05:29:50.922007 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd6302fd_1260_4793_9d9b_2dbfba20a013.slice/crio-f54d671ec190963a10f6a2cab212f99b38a595e6fb3258e051efa64567d9927d WatchSource:0}: Error finding container f54d671ec190963a10f6a2cab212f99b38a595e6fb3258e051efa64567d9927d: Status 404 returned error can't find the container with id f54d671ec190963a10f6a2cab212f99b38a595e6fb3258e051efa64567d9927d Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.924600 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.925648 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4qg5\" (UniqueName: \"kubernetes.io/projected/22aa82e5-83a2-4046-8d11-89e9d34e00e1-kube-api-access-q4qg5\") pod \"marketplace-operator-79b997595-zhjjq\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.933949 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.948579 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r"] Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.948812 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7x2p\" (UniqueName: \"kubernetes.io/projected/91f38a82-66cb-4dbd-a270-4e2000042f25-kube-api-access-j7x2p\") pod \"service-ca-operator-777779d784-5fgj8\" (UID: \"91f38a82-66cb-4dbd-a270-4e2000042f25\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.950098 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz"] Mar 08 05:29:50 crc kubenswrapper[4717]: W0308 05:29:50.951519 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92f72a27_7281_40cc_89e0_e0424b81c21d.slice/crio-399578e4d6e55fd76a34f8435f04de4a6a8ae26d0373f39fcacf1af8a7cf604d WatchSource:0}: Error finding container 399578e4d6e55fd76a34f8435f04de4a6a8ae26d0373f39fcacf1af8a7cf604d: Status 404 returned error can't find the container with id 399578e4d6e55fd76a34f8435f04de4a6a8ae26d0373f39fcacf1af8a7cf604d Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.957605 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.961615 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64ttc\" (UniqueName: \"kubernetes.io/projected/0507da4e-a2d5-43c2-b5e2-25f42085431c-kube-api-access-64ttc\") pod \"control-plane-machine-set-operator-78cbb6b69f-hhrcp\" (UID: \"0507da4e-a2d5-43c2-b5e2-25f42085431c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.964363 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.975760 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" Mar 08 05:29:50 crc kubenswrapper[4717]: I0308 05:29:50.993422 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549128-gnphx" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.001416 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.008928 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.009335 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.010112 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.51009178 +0000 UTC m=+218.427740624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.010191 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.010674 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.510666155 +0000 UTC m=+218.428314999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.014360 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.014777 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqftc\" (UniqueName: \"kubernetes.io/projected/da83d24b-fb47-44b9-a05e-228eabe397cf-kube-api-access-gqftc\") pod \"collect-profiles-29549115-zkqg9\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.017751 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.019700 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.028647 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.036706 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5d44b" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.053407 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-szjtn" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.057930 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jvkgl"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.059132 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hgxsr"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.075267 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6jlgw" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.081861 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.081902 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.096588 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" event={"ID":"f2c2b864-898a-4b84-a3ab-168051c21e34","Type":"ContainerStarted","Data":"eef0185e06935f519c029990a6315dfc97df336c0ff20101cb5e110f25d3eab0"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.098929 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz"] Mar 08 05:29:51 crc kubenswrapper[4717]: W0308 05:29:51.105875 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bb8daf7_8d77_44c2_ab01_b02257d17ac9.slice/crio-3a9d6abdc429dd36f793c8a9a4e75da908c8f02e605b1a25d8d8cfccbd9cb829 WatchSource:0}: Error finding container 3a9d6abdc429dd36f793c8a9a4e75da908c8f02e605b1a25d8d8cfccbd9cb829: Status 404 returned error can't find the container 
with id 3a9d6abdc429dd36f793c8a9a4e75da908c8f02e605b1a25d8d8cfccbd9cb829 Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.107535 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" event={"ID":"4f7b6ea6-9a3d-432d-a034-956e93323452","Type":"ContainerStarted","Data":"b2872fdcce53abdbf1bafbe1003e899107c09a0e84ffce928aa689774813b461"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.113358 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.113766 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.613743993 +0000 UTC m=+218.531392837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.130564 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" event={"ID":"f71b5e8a-6657-41d6-a447-ca755016bed2","Type":"ContainerStarted","Data":"1a951221afd3928e01c55a3513d85adc4c3ea52d9bf8ad4d5f77e0ee6a7693d1"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.133877 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" event={"ID":"bb0868af-d764-41ec-a4f8-0d7086fbb1cc","Type":"ContainerStarted","Data":"a29f15edb8afc7f597962a281ba564b10cc683939c092b7647b7e5ba75da6080"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.133907 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" event={"ID":"bb0868af-d764-41ec-a4f8-0d7086fbb1cc","Type":"ContainerStarted","Data":"cee4c85f147b055a04df0d9bc453a6aff4aac9bad32e642675b339a5a923be2b"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.137024 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" event={"ID":"341123d7-044c-40c6-85bc-7a1685c07046","Type":"ContainerStarted","Data":"46dd6194c15462f62c46e89b665a0faa63e30891cf79f1e647e29ab9d984c6d8"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.145389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" event={"ID":"8a32992d-6f35-4172-8081-9f64b078e2b3","Type":"ContainerStarted","Data":"8a74bf9de5394420c1a07f9e7e680b5a79da4309a7d866bf9cb85c7d9c880035"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.145820 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" event={"ID":"8a32992d-6f35-4172-8081-9f64b078e2b3","Type":"ContainerStarted","Data":"14d4bb5e15a900066dc679a61d98c1d613371a576a084c5d6d65ee0fde7d9f48"} Mar 08 05:29:51 crc kubenswrapper[4717]: W0308 05:29:51.150116 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb782ddc_5c1b_4352_9c44_d8ada04559e0.slice/crio-71bee77f97eda9bda03391a6932fa9535f3277d11634d7f9a8e9049b2ab00a88 WatchSource:0}: Error finding container 71bee77f97eda9bda03391a6932fa9535f3277d11634d7f9a8e9049b2ab00a88: Status 404 returned error can't find the container with id 71bee77f97eda9bda03391a6932fa9535f3277d11634d7f9a8e9049b2ab00a88 Mar 08 05:29:51 crc kubenswrapper[4717]: W0308 05:29:51.156006 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod311276df_2e61_4ff4_bfe0_1e6bf4e327dd.slice/crio-0809ac3a90a257efa5e0a0d1beb7cdbdca0729740dd44b654e9785d5dc678d99 WatchSource:0}: Error finding container 0809ac3a90a257efa5e0a0d1beb7cdbdca0729740dd44b654e9785d5dc678d99: Status 404 returned error can't find the container with id 0809ac3a90a257efa5e0a0d1beb7cdbdca0729740dd44b654e9785d5dc678d99 Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.165665 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" 
event={"ID":"fd6302fd-1260-4793-9d9b-2dbfba20a013","Type":"ContainerStarted","Data":"f54d671ec190963a10f6a2cab212f99b38a595e6fb3258e051efa64567d9927d"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.166841 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" event={"ID":"1e41d744-e947-4c09-be5d-343f5a6d2bd1","Type":"ContainerStarted","Data":"61fe1ecab6877e81f8ad67abbf60bc268a24ef0c668588beb24fdd43a3755f59"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.215203 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.216827 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" event={"ID":"d726a0f6-4858-4e71-8513-75c63f0bfb8d","Type":"ContainerStarted","Data":"70e065f49ca3fc6fdde210efc9868ce1acd64c944fc70a04e9a5a67eafa93db2"} Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.217190 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.717172101 +0000 UTC m=+218.634820945 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.257125 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-bdgwk" event={"ID":"f8e73a19-e7c1-4504-8499-4566b10f2682","Type":"ContainerStarted","Data":"060042e0ea5b6d52b5e6f2ead1c799e3102adc62a77b7218f5ebf816f46bec20"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.274622 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.300125 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" event={"ID":"92f72a27-7281-40cc-89e0-e0424b81c21d","Type":"ContainerStarted","Data":"399578e4d6e55fd76a34f8435f04de4a6a8ae26d0373f39fcacf1af8a7cf604d"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.318331 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.318722 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.8186948 +0000 UTC m=+218.736343644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.318773 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.319556 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.819548292 +0000 UTC m=+218.737197136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.319543 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zhsbw" event={"ID":"cd118c79-042d-48f5-a360-884f4466f65b","Type":"ContainerStarted","Data":"376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.319630 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zhsbw" event={"ID":"cd118c79-042d-48f5-a360-884f4466f65b","Type":"ContainerStarted","Data":"79ec35763aee427ccc25a0b90c8d02265062968bba38bd95e4cce080a036438b"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.327582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" event={"ID":"4ca60946-75d5-469e-84f0-d200ca8c0cfd","Type":"ContainerStarted","Data":"e97988897c5835f5ee56c44e41710413f76d9d4a9bb65f9f1ed9d213746da63c"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.332031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" event={"ID":"4ca60946-75d5-469e-84f0-d200ca8c0cfd","Type":"ContainerStarted","Data":"b5c9e2b0e7a628d50b7553d14577427425d10985509493a879cd958244d9cd24"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.332065 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6"] Mar 08 05:29:51 crc 
kubenswrapper[4717]: I0308 05:29:51.332092 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" event={"ID":"4ca60946-75d5-469e-84f0-d200ca8c0cfd","Type":"ContainerStarted","Data":"4b16b9af2d15f1e3f01bbf98ac03e2c3ac5c970ad6b93d78d84e5ecffb1154c4"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.336495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" event={"ID":"f2de96d0-d47e-4240-832d-c9b1e1c882df","Type":"ContainerStarted","Data":"dbe1e79d699afa14b45f81aba1596899e657267e534b6868ffe9b9baf8a4cb52"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.336585 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" event={"ID":"f2de96d0-d47e-4240-832d-c9b1e1c882df","Type":"ContainerStarted","Data":"031c898fb5340e6a3747754d7ec77f05c4c3b25d6922ccc1e00466c7413e58ae"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.337063 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.338894 4717 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mxg8l container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.338948 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" podUID="f2de96d0-d47e-4240-832d-c9b1e1c882df" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.340224 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" event={"ID":"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077","Type":"ContainerStarted","Data":"00e256d039dcff542c479f82c873646895d2d7299dc4d3beaf3c136beed3d5ac"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.340258 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" event={"ID":"7edd6b0f-e7fd-4133-9d1e-8a7b7356b077","Type":"ContainerStarted","Data":"5704fe6b2aa7e2a2c513809c2d80454eb03a4ae0c512c356ad950b2e35e68938"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.343010 4717 generic.go:334] "Generic (PLEG): container finished" podID="11bd956f-1b8e-461d-b42b-50f1b7417607" containerID="9ea4435d37923a97d4202e0aa1c31fc16684ad7e2644084c80fdb14498523dc9" exitCode=0 Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.343114 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" event={"ID":"11bd956f-1b8e-461d-b42b-50f1b7417607","Type":"ContainerDied","Data":"9ea4435d37923a97d4202e0aa1c31fc16684ad7e2644084c80fdb14498523dc9"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.343159 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" event={"ID":"11bd956f-1b8e-461d-b42b-50f1b7417607","Type":"ContainerStarted","Data":"f589428c38f35a2f586eba1bd61ecfb7be1717d0bd13ba7dce05785df7ca0e3d"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.348595 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" event={"ID":"3b2daede-7003-4bcc-9a92-6342eb319181","Type":"ContainerStarted","Data":"68d7f54ba2c53afa8fdee28db793c495a94083e85a76fcd822b25f8697208200"} Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.349899 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.351167 4717 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hwxhw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.351279 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" podUID="3b2daede-7003-4bcc-9a92-6342eb319181" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.421335 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.421465 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.92143462 +0000 UTC m=+218.839083464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.421781 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.424232 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:51.924205422 +0000 UTC m=+218.841854476 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.524581 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.525165 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.025132365 +0000 UTC m=+218.942781199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.525240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.526406 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.026390227 +0000 UTC m=+218.944039071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.626889 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.627653 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.127628098 +0000 UTC m=+219.045276942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.730076 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.731211 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.731522 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hp5l5"] Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.731569 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.231554209 +0000 UTC m=+219.149203043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.761581 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-smkdd"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.837065 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.837431 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.33740891 +0000 UTC m=+219.255057754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: W0308 05:29:51.859599 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113c7a12_4f47_40ba_be5e_abf62359ffe3.slice/crio-a4a9ba7426e2590e00557c255dacec31213facd185e6049e36d59128762d0ada WatchSource:0}: Error finding container a4a9ba7426e2590e00557c255dacec31213facd185e6049e36d59128762d0ada: Status 404 returned error can't find the container with id a4a9ba7426e2590e00557c255dacec31213facd185e6049e36d59128762d0ada Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.890675 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lvb9t"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.890769 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8lcq6"] Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.942473 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:51 crc kubenswrapper[4717]: E0308 05:29:51.945128 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.445095599 +0000 UTC m=+219.362744433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.951675 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.970879 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:29:51 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:29:51 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:29:51 crc kubenswrapper[4717]: healthz check failed Mar 08 05:29:51 crc kubenswrapper[4717]: I0308 05:29:51.971365 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.045412 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:52 crc kubenswrapper[4717]: E0308 05:29:52.046037 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.546015922 +0000 UTC m=+219.463664766 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.157753 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:52 crc kubenswrapper[4717]: E0308 05:29:52.158371 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.658347211 +0000 UTC m=+219.575996055 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.232585 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw"] Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.265714 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:52 crc kubenswrapper[4717]: E0308 05:29:52.270128 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.770097395 +0000 UTC m=+219.687746239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.276586 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f"
Mar 08 05:29:52 crc kubenswrapper[4717]: E0308 05:29:52.277094 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.777065906 +0000 UTC m=+219.694714740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.292257 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" podStartSLOduration=159.292232001 podStartE2EDuration="2m39.292232001s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:52.292017685 +0000 UTC m=+219.209666529" watchObservedRunningTime="2026-03-08 05:29:52.292232001 +0000 UTC m=+219.209880845"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.378432 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 05:29:52 crc kubenswrapper[4717]: E0308 05:29:52.378732 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:52.878714548 +0000 UTC m=+219.796363392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.396806 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cjzqs" podStartSLOduration=159.396789468 podStartE2EDuration="2m39.396789468s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:52.355076704 +0000 UTC m=+219.272725548" watchObservedRunningTime="2026-03-08 05:29:52.396789468 +0000 UTC m=+219.314438312"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.442288 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-zhsbw" podStartSLOduration=159.442249509 podStartE2EDuration="2m39.442249509s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:52.440690269 +0000 UTC m=+219.358339113" watchObservedRunningTime="2026-03-08 05:29:52.442249509 +0000 UTC m=+219.359898353"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.471856 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" event={"ID":"141bf3de-a931-4d7d-9957-34f4c180819a","Type":"ContainerStarted","Data":"ae5baf643896c7cd172a8ca897f741e09f08b4d3938e4775c54f1922ea6a54f4"}
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.490844 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" podStartSLOduration=160.490817942 podStartE2EDuration="2m40.490817942s" podCreationTimestamp="2026-03-08 05:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:52.486995712 +0000 UTC m=+219.404644556" watchObservedRunningTime="2026-03-08 05:29:52.490817942 +0000 UTC m=+219.408466786"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.492131 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8"]
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.532436 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" event={"ID":"0b37d34e-7f32-4fe6-b26d-09e780f37d86","Type":"ContainerStarted","Data":"0c22fbbb2d29e9b3dc7faaec97559312f179ca09a0a09615bc83ed775c892020"}
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.549123 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f"
Mar 08 05:29:52 crc kubenswrapper[4717]: E0308 05:29:52.549628 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.04961135 +0000 UTC m=+219.967260194 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.624604 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ppg2t" podStartSLOduration=159.624583998 podStartE2EDuration="2m39.624583998s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:52.568311976 +0000 UTC m=+219.485960820" watchObservedRunningTime="2026-03-08 05:29:52.624583998 +0000 UTC m=+219.542232842"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.632464 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9"]
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.642725 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" event={"ID":"3cdfb7b8-6ad7-4243-b963-e96432a2500b","Type":"ContainerStarted","Data":"09ccc954e88a31b6477ea43da70d50bac9c33a0db18f812397b742f87aa43753"}
Mar 08 05:29:52 crc kubenswrapper[4717]: W0308 05:29:52.642989 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91f38a82_66cb_4dbd_a270_4e2000042f25.slice/crio-7884c929477479fd8f1a61c19fd15b77922993e37d3f78284385c74d0c0b28c5 WatchSource:0}: Error finding container 7884c929477479fd8f1a61c19fd15b77922993e37d3f78284385c74d0c0b28c5: Status 404 returned error can't find the container with id 7884c929477479fd8f1a61c19fd15b77922993e37d3f78284385c74d0c0b28c5
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.649920 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5d44b"]
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.652789 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-5tthm" podStartSLOduration=160.65276684 podStartE2EDuration="2m40.65276684s" podCreationTimestamp="2026-03-08 05:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:52.62197846 +0000 UTC m=+219.539627304" watchObservedRunningTime="2026-03-08 05:29:52.65276684 +0000 UTC m=+219.570415684"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.654493 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhjjq"]
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.669134 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 05:29:52 crc kubenswrapper[4717]: E0308 05:29:52.669800 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.169777892 +0000 UTC m=+220.087426736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.761252 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-szjtn"]
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.766127 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" event={"ID":"d726a0f6-4858-4e71-8513-75c63f0bfb8d","Type":"ContainerStarted","Data":"8d0d76a950b2dddd8438dadcde2ea9fe1ca96ee6523c6cbfa110c97b111a5a9f"}
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.767432 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-67vch" podStartSLOduration=160.767418459 podStartE2EDuration="2m40.767418459s" podCreationTimestamp="2026-03-08 05:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:52.767149882 +0000 UTC m=+219.684798726" watchObservedRunningTime="2026-03-08 05:29:52.767418459 +0000 UTC m=+219.685067303"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.773671 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f"
Mar 08 05:29:52 crc kubenswrapper[4717]: E0308 05:29:52.774099 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.274082752 +0000 UTC m=+220.191731596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.793886 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549128-gnphx"]
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.808186 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp"]
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.873259 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jvkgl" event={"ID":"5287b01d-2b35-4de8-8a24-f8fa7e778bc0","Type":"ContainerStarted","Data":"c869981223e289d16ed7cb87a2d83674ac8ae46b9d21c643f2aabc27d70dd94f"}
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.874485 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 05:29:52 crc kubenswrapper[4717]: E0308 05:29:52.874845 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.37482669 +0000 UTC m=+220.292475534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:52 crc kubenswrapper[4717]: W0308 05:29:52.879838 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda83d24b_fb47_44b9_a05e_228eabe397cf.slice/crio-c1a7f987a888687725d237dace01d3a84f47f4e1844d05c5c65975d1299a3570 WatchSource:0}: Error finding container c1a7f987a888687725d237dace01d3a84f47f4e1844d05c5c65975d1299a3570: Status 404 returned error can't find the container with id c1a7f987a888687725d237dace01d3a84f47f4e1844d05c5c65975d1299a3570
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.921665 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" event={"ID":"f2c2b864-898a-4b84-a3ab-168051c21e34","Type":"ContainerStarted","Data":"597b20df3dee6ad756c886e80b82afb3324f592e744b61133410ea5783f1c1e9"}
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.934880 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" event={"ID":"bb782ddc-5c1b-4352-9c44-d8ada04559e0","Type":"ContainerStarted","Data":"2167b7e910c2be19122660ecc9549faa509a14ec59983101850310c94f2beb53"}
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.934951 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" event={"ID":"bb782ddc-5c1b-4352-9c44-d8ada04559e0","Type":"ContainerStarted","Data":"71bee77f97eda9bda03391a6932fa9535f3277d11634d7f9a8e9049b2ab00a88"}
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.936994 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lvb9t" event={"ID":"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6","Type":"ContainerStarted","Data":"04c6b006769359b777334a4ee5a9251a83352a8fbddad7be3b0bc37592b41639"}
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.943316 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" event={"ID":"175ee8df-26ba-40d0-a30e-6f5bcb5435b7","Type":"ContainerStarted","Data":"2aca45b5dee02b883a8261f3ee4111158ddd859e5786a070826c4056a1364656"}
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.945716 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.948719 4717 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-z2rl6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.948778 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" podUID="175ee8df-26ba-40d0-a30e-6f5bcb5435b7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.958542 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" event={"ID":"73c6f272-9791-479e-8dde-b761d5da5b75","Type":"ContainerStarted","Data":"7f0a26f826ab996d6b2cf860ff9ce61fcfbc9606b2817099d92d0212453b27b6"}
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.972077 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 05:29:52 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld
Mar 08 05:29:52 crc kubenswrapper[4717]: [+]process-running ok
Mar 08 05:29:52 crc kubenswrapper[4717]: healthz check failed
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.972132 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.977946 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f"
Mar 08 05:29:52 crc kubenswrapper[4717]: E0308 05:29:52.979351 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.479336097 +0000 UTC m=+220.396984941 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.983087 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" podStartSLOduration=159.983064093 podStartE2EDuration="2m39.983064093s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:52.97755339 +0000 UTC m=+219.895202234" watchObservedRunningTime="2026-03-08 05:29:52.983064093 +0000 UTC m=+219.900712937"
Mar 08 05:29:52 crc kubenswrapper[4717]: I0308 05:29:52.983480 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-bdgwk" podStartSLOduration=159.983474244 podStartE2EDuration="2m39.983474244s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:52.858606449 +0000 UTC m=+219.776255293" watchObservedRunningTime="2026-03-08 05:29:52.983474244 +0000 UTC m=+219.901123088"
Mar 08 05:29:52 crc kubenswrapper[4717]: W0308 05:29:52.989322 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9b12aad_95fa_4dc9_87b9_eb31c285f487.slice/crio-37b177ec92170ba75dff8d9ebdd34586ec24763d5198def1b5069367cdf26217 WatchSource:0}: Error finding container 37b177ec92170ba75dff8d9ebdd34586ec24763d5198def1b5069367cdf26217: Status 404 returned error can't find the container with id 37b177ec92170ba75dff8d9ebdd34586ec24763d5198def1b5069367cdf26217
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.049017 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" event={"ID":"311276df-2e61-4ff4-bfe0-1e6bf4e327dd","Type":"ContainerStarted","Data":"0809ac3a90a257efa5e0a0d1beb7cdbdca0729740dd44b654e9785d5dc678d99"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.058180 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" event={"ID":"94a4bc63-d223-4305-abe4-a9a259db716d","Type":"ContainerStarted","Data":"82aa56d9a6d3a0b0653186e272e7b0132cc19cc40e8ee0f6460c4055a17b6728"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.058293 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" event={"ID":"94a4bc63-d223-4305-abe4-a9a259db716d","Type":"ContainerStarted","Data":"6afe6289d2da2a2b4d2ab44c2ed6da7f9286d4f7f2011fc0f6ffab58044bcce1"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.079049 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.079490 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.579470579 +0000 UTC m=+220.497119423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.079544 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f"
Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.080186 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.580163527 +0000 UTC m=+220.497812551 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.088014 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" event={"ID":"4f7b6ea6-9a3d-432d-a034-956e93323452","Type":"ContainerStarted","Data":"8f0e7f5ec7cd53d80fa66e04a18a93c52963ab3eb04d2466f22bc8971acd91fc"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.106486 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.110434 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" event={"ID":"1e41d744-e947-4c09-be5d-343f5a6d2bd1","Type":"ContainerStarted","Data":"47b27750ff5a23934b1b3ebaa146280904a3fef62b522754dd9843dd31ed470d"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.123861 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qj6pl" podStartSLOduration=160.119672454 podStartE2EDuration="2m40.119672454s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:53.109635163 +0000 UTC m=+220.027284007" watchObservedRunningTime="2026-03-08 05:29:53.119672454 +0000 UTC m=+220.037321288"
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.130424 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" event={"ID":"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16","Type":"ContainerStarted","Data":"341a2596eae633cb4a15dc526d46e19abc4feb9f39930cc7586a05d134d38c2e"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.185018 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.185560 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.685521955 +0000 UTC m=+220.603170799 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.186017 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f"
Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.186419 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.686403148 +0000 UTC m=+220.604051992 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.200745 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" podStartSLOduration=160.20070615 podStartE2EDuration="2m40.20070615s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:53.189097118 +0000 UTC m=+220.106745952" watchObservedRunningTime="2026-03-08 05:29:53.20070615 +0000 UTC m=+220.118354994"
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.202610 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" podStartSLOduration=161.202600049 podStartE2EDuration="2m41.202600049s" podCreationTimestamp="2026-03-08 05:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:53.141747227 +0000 UTC m=+220.059396061" watchObservedRunningTime="2026-03-08 05:29:53.202600049 +0000 UTC m=+220.120248893"
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.203934 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" event={"ID":"3b2daede-7003-4bcc-9a92-6342eb319181","Type":"ContainerStarted","Data":"4f7178bea61a5d816f4de80ed570964febc8c0976cf1b97c4e5e81ed0b42bf1d"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.207252 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" event={"ID":"341123d7-044c-40c6-85bc-7a1685c07046","Type":"ContainerStarted","Data":"0e0070f44dea2f627bd5552d5a728c58988602c574a46d58b1c7be0552609959"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.213517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" event={"ID":"fd6302fd-1260-4793-9d9b-2dbfba20a013","Type":"ContainerStarted","Data":"fa80452d5e012a2f7f3ebd088f101350f3b843c6a3fc27189de53e97c4527e78"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.214770 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s"
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.217310 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" event={"ID":"731d619c-5e97-4335-9d85-008faeb45d03","Type":"ContainerStarted","Data":"9b990d5fd9266641b60dbbee95ae45879dde49f82733a21c8a2d46c0f3e9caa2"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.222219 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw"
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.237664 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c4ppr" podStartSLOduration=160.237634039 podStartE2EDuration="2m40.237634039s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:53.230951076 +0000 UTC m=+220.148599920" watchObservedRunningTime="2026-03-08 05:29:53.237634039 +0000 UTC m=+220.155282873"
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.241161 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" event={"ID":"6102a82e-ea92-4f30-9eba-0cf4b19a3d87","Type":"ContainerStarted","Data":"b8cc953c95803d5011454296fecf1709ef90787180a42d5d4fae73d3f496b1c3"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.242582 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6"
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.262384 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6jlgw" event={"ID":"113c7a12-4f47-40ba-be5e-abf62359ffe3","Type":"ContainerStarted","Data":"a4a9ba7426e2590e00557c255dacec31213facd185e6049e36d59128762d0ada"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.274041 4717 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-d9cd6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.274414 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" podUID="6102a82e-ea92-4f30-9eba-0cf4b19a3d87" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.280415 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" event={"ID":"efe1d476-e883-4331-a1ab-376353731509","Type":"ContainerStarted","Data":"b9acd6a8d2ca5ecf5bdcf56ab015122400a837e5f84d3c57936767f4a22a158d"}
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.298674 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.299642 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.79962163 +0000 UTC m=+220.717270474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.299807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f"
Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.306491 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.806470098 +0000 UTC m=+220.724118942 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.310314 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hgxsr" event={"ID":"6bb8daf7-8d77-44c2-ab01-b02257d17ac9","Type":"ContainerStarted","Data":"3a9d6abdc429dd36f793c8a9a4e75da908c8f02e605b1a25d8d8cfccbd9cb829"} Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.311376 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hgxsr" Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.341678 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" event={"ID":"10744e37-3152-49e4-ac03-d7eaa6a3439e","Type":"ContainerStarted","Data":"5a5326c557e2db8e35ba0f548ce7820aa4fae6579c1b59e779d1e227effe97f3"} Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.341919 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" podStartSLOduration=160.341894549 podStartE2EDuration="2m40.341894549s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:53.329466906 +0000 UTC m=+220.247115750" watchObservedRunningTime="2026-03-08 05:29:53.341894549 +0000 UTC m=+220.259543393" Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.354946 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz" event={"ID":"915a9413-e8fc-4441-b80a-f2a24186ad76","Type":"ContainerStarted","Data":"cdbe8a1064e18e3e87b617b95645bb2b5cad28f44e0d19dc0957563cc3cde09b"} Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.355015 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz" event={"ID":"915a9413-e8fc-4441-b80a-f2a24186ad76","Type":"ContainerStarted","Data":"c0f1dad8a5a32d29cc9a178d712ed362c5eba9b19f9c40ba2dd26dda2050323b"} Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.359571 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" event={"ID":"f71b5e8a-6657-41d6-a447-ca755016bed2","Type":"ContainerStarted","Data":"8b587c435d22fa27f10a3b61934223bf672943d592b2a0673275512cb0cca731"} Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.363021 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" event={"ID":"92f72a27-7281-40cc-89e0-e0424b81c21d","Type":"ContainerStarted","Data":"f3ab34655d62b34f5173b5c21f5a88ea077c5c94bf750d5445ecd423e9c0f980"} Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.370068 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-hgxsr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.370232 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hgxsr" podUID="6bb8daf7-8d77-44c2-ab01-b02257d17ac9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" 
Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.389376 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.390043 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-6jlgw" podStartSLOduration=6.39002475 podStartE2EDuration="6.39002475s" podCreationTimestamp="2026-03-08 05:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:53.389374093 +0000 UTC m=+220.307022937" watchObservedRunningTime="2026-03-08 05:29:53.39002475 +0000 UTC m=+220.307673594" Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.402400 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.403580 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.903561561 +0000 UTC m=+220.821210405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.403782 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.407618 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:53.907601496 +0000 UTC m=+220.825250340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.425855 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-hgxsr" podStartSLOduration=160.42583592 podStartE2EDuration="2m40.42583592s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:53.424443684 +0000 UTC m=+220.342092528" watchObservedRunningTime="2026-03-08 05:29:53.42583592 +0000 UTC m=+220.343484764" Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.452757 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" podStartSLOduration=160.45273944 podStartE2EDuration="2m40.45273944s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:53.448487399 +0000 UTC m=+220.366136243" watchObservedRunningTime="2026-03-08 05:29:53.45273944 +0000 UTC m=+220.370388284" Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.494636 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7v6pp" podStartSLOduration=160.494610138 podStartE2EDuration="2m40.494610138s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:53.488394296 +0000 UTC m=+220.406043140" watchObservedRunningTime="2026-03-08 05:29:53.494610138 +0000 UTC m=+220.412258992" Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.505944 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.506334 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:54.006317512 +0000 UTC m=+220.923966346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.513544 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz" podStartSLOduration=160.513506389 podStartE2EDuration="2m40.513506389s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:53.505400898 +0000 UTC m=+220.423049742" watchObservedRunningTime="2026-03-08 05:29:53.513506389 +0000 UTC m=+220.431155233" Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.608979 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.609399 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:54.109384981 +0000 UTC m=+221.027033825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.709843 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.710073 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:54.210051127 +0000 UTC m=+221.127699971 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.815027 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-08 05:29:54.315008054 +0000 UTC m=+221.232656888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.823058 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.879120 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4l75s" Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.924533 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:53 crc kubenswrapper[4717]: E0308 05:29:53.925031 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 05:29:54.425009503 +0000 UTC m=+221.342658347 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.966885 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:29:53 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:29:53 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:29:53 crc kubenswrapper[4717]: healthz check failed Mar 08 05:29:53 crc kubenswrapper[4717]: I0308 05:29:53.966957 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.028575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.029016 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:54.529001536 +0000 UTC m=+221.446650380 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.132224 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.133143 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:54.633121882 +0000 UTC m=+221.550770726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.235596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.236068 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:54.736052477 +0000 UTC m=+221.653701321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.337437 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.337642 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:54.837600716 +0000 UTC m=+221.755249560 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.338149 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.338520 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:54.838505619 +0000 UTC m=+221.756154463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.405436 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" event={"ID":"efe1d476-e883-4331-a1ab-376353731509","Type":"ContainerStarted","Data":"99ed9d82dffba2d982cb289ef727d6e4398de1862b969551083047f34d452ace"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.406548 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.418779 4717 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mrdlw container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.418851 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" podUID="efe1d476-e883-4331-a1ab-376353731509" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.439360 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.440570 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:54.940545121 +0000 UTC m=+221.858193965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.462842 4717 ???:1] "http: TLS handshake error from 192.168.126.11:40780: no serving certificate available for the kubelet" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.463163 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" event={"ID":"10744e37-3152-49e4-ac03-d7eaa6a3439e","Type":"ContainerStarted","Data":"d6dce157a521b6cfb9518b2a1e259f4499fb015e4a640da49bef6446f4f0edc9"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.463226 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" event={"ID":"10744e37-3152-49e4-ac03-d7eaa6a3439e","Type":"ContainerStarted","Data":"0e3423b7c0dc4b5b096dbf4e937043eab8471610461464d64bd3b4f3e326feb6"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.472236 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-smfpz" event={"ID":"915a9413-e8fc-4441-b80a-f2a24186ad76","Type":"ContainerStarted","Data":"b99c18247b080069df1e0f751bb0ecf67488eaae1db8117daa458949e79283fa"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.482351 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" event={"ID":"d726a0f6-4858-4e71-8513-75c63f0bfb8d","Type":"ContainerStarted","Data":"a12dd15e1eba15081cce5130ba9ebf02ea238898c5166dbd8a822001fafa2e50"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.493705 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" event={"ID":"311276df-2e61-4ff4-bfe0-1e6bf4e327dd","Type":"ContainerStarted","Data":"ce691a7de4afc5a023b1bb90a436c655ce0fe2cdf03b32141948a81be01e5057"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.493784 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" event={"ID":"311276df-2e61-4ff4-bfe0-1e6bf4e327dd","Type":"ContainerStarted","Data":"4257b542965a4588f818664a67779dd6c9aa15a5420172ad3119b4d88507046c"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.538166 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" event={"ID":"92f72a27-7281-40cc-89e0-e0424b81c21d","Type":"ContainerStarted","Data":"92cc7449415d2ea4b732e98a90a54f6b5d00f9f7266974b09d749f470c10cfb0"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.538222 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.541020 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.541445 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:55.041429433 +0000 UTC m=+221.959078277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.555231 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" podStartSLOduration=161.555196601 podStartE2EDuration="2m41.555196601s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:54.543088376 +0000 UTC m=+221.460737220" watchObservedRunningTime="2026-03-08 05:29:54.555196601 +0000 UTC m=+221.472845435" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.569710 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" 
event={"ID":"91f38a82-66cb-4dbd-a270-4e2000042f25","Type":"ContainerStarted","Data":"0383ac6b5541ff46ab99b5903ee5ee0ed69551ba9058c4c11d4031182d0a72a9"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.569800 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" event={"ID":"91f38a82-66cb-4dbd-a270-4e2000042f25","Type":"ContainerStarted","Data":"7884c929477479fd8f1a61c19fd15b77922993e37d3f78284385c74d0c0b28c5"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.577223 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hgxsr" event={"ID":"6bb8daf7-8d77-44c2-ab01-b02257d17ac9","Type":"ContainerStarted","Data":"d74b01195b1588c9374271f684801f2b49b23f112cd7aae27e797833271c0557"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.586723 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-hgxsr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.586786 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hgxsr" podUID="6bb8daf7-8d77-44c2-ab01-b02257d17ac9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.604230 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5d44b" event={"ID":"b9b12aad-95fa-4dc9-87b9-eb31c285f487","Type":"ContainerStarted","Data":"138dfd8db0060a075177fe0a81e75c2f3eb53c5ee39d4062c5d9c45b5a13046a"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.604834 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-5d44b" event={"ID":"b9b12aad-95fa-4dc9-87b9-eb31c285f487","Type":"ContainerStarted","Data":"37b177ec92170ba75dff8d9ebdd34586ec24763d5198def1b5069367cdf26217"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.614954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" event={"ID":"6102a82e-ea92-4f30-9eba-0cf4b19a3d87","Type":"ContainerStarted","Data":"64e00d5ea32bd51feee266a0ca8f563fa804d8f213827f179509f2dbc9e1bc13"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.625216 4717 generic.go:334] "Generic (PLEG): container finished" podID="73c6f272-9791-479e-8dde-b761d5da5b75" containerID="ece16f6eda2813b32b0f77e111cf8da4fc5928e077f544954ce6781a6b1a9f64" exitCode=0 Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.625297 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" event={"ID":"73c6f272-9791-479e-8dde-b761d5da5b75","Type":"ContainerDied","Data":"ece16f6eda2813b32b0f77e111cf8da4fc5928e077f544954ce6781a6b1a9f64"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.633019 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ctr2b" podStartSLOduration=161.632994883 podStartE2EDuration="2m41.632994883s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:54.616648048 +0000 UTC m=+221.534296882" watchObservedRunningTime="2026-03-08 05:29:54.632994883 +0000 UTC m=+221.550643727" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.635503 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-d9cd6" Mar 08 05:29:54 crc kubenswrapper[4717]: 
I0308 05:29:54.642303 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" event={"ID":"175ee8df-26ba-40d0-a30e-6f5bcb5435b7","Type":"ContainerStarted","Data":"d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.642709 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.643016 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:55.142995643 +0000 UTC m=+222.060644487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.645224 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.648857 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:55.148847975 +0000 UTC m=+222.066496819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.652255 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.663164 4717 generic.go:334] "Generic (PLEG): container finished" podID="94a4bc63-d223-4305-abe4-a9a259db716d" containerID="82aa56d9a6d3a0b0653186e272e7b0132cc19cc40e8ee0f6460c4055a17b6728" exitCode=0 Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.663267 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" event={"ID":"94a4bc63-d223-4305-abe4-a9a259db716d","Type":"ContainerDied","Data":"82aa56d9a6d3a0b0653186e272e7b0132cc19cc40e8ee0f6460c4055a17b6728"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.663312 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" event={"ID":"94a4bc63-d223-4305-abe4-a9a259db716d","Type":"ContainerStarted","Data":"9071c2883ceb46d3d9e75808cc5aad410d0c1f2bd199cbef8371db109209a326"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.664093 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.665892 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" 
event={"ID":"3cdfb7b8-6ad7-4243-b963-e96432a2500b","Type":"ContainerStarted","Data":"17d742f06d939a33294a3c1e8ac4e8bb3cd47628fdce179489129ab020a55a35"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.667256 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jvkgl" event={"ID":"5287b01d-2b35-4de8-8a24-f8fa7e778bc0","Type":"ContainerStarted","Data":"71fa2cdc4a9a30c18599e18b5b2e8019c57a38985853421376e7f1bcd40203de"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.667852 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.687598 4717 ???:1] "http: TLS handshake error from 192.168.126.11:40788: no serving certificate available for the kubelet" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.695614 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mhq2" event={"ID":"13a322b9-ab5c-44e7-bcda-9b05a3ef2f16","Type":"ContainerStarted","Data":"fe29e1c831ba6a38a9a5cbdedb6d8f8444aae23562dd7474dfa13a65fb626892"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.705331 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" podStartSLOduration=161.705296332 podStartE2EDuration="2m41.705296332s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:54.665087757 +0000 UTC m=+221.582736601" watchObservedRunningTime="2026-03-08 05:29:54.705296332 +0000 UTC m=+221.622945186" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.725530 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" event={"ID":"22aa82e5-83a2-4046-8d11-89e9d34e00e1","Type":"ContainerStarted","Data":"e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.725594 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" event={"ID":"22aa82e5-83a2-4046-8d11-89e9d34e00e1","Type":"ContainerStarted","Data":"25c89bced4be891da572e03247861075e21bd75153e8f4ee70825b5f5930cefb"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.726833 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.734813 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zhjjq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.734870 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.746438 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.747611 4717 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:55.247589811 +0000 UTC m=+222.165238655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.753840 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dfjxc" podStartSLOduration=161.753819403 podStartE2EDuration="2m41.753819403s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:54.707008366 +0000 UTC m=+221.624657210" watchObservedRunningTime="2026-03-08 05:29:54.753819403 +0000 UTC m=+221.671468247" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.761874 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" event={"ID":"bb782ddc-5c1b-4352-9c44-d8ada04559e0","Type":"ContainerStarted","Data":"dc662c04d65427430690503784b6f7e0ab8c357833138b3b92ceecebcad98a7a"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.766051 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" 
event={"ID":"da83d24b-fb47-44b9-a05e-228eabe397cf","Type":"ContainerStarted","Data":"10c91eb7ceb9954f58af7155495ffdc92f9e0ce43fa023a49ee709e7eb89a677"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.766146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" event={"ID":"da83d24b-fb47-44b9-a05e-228eabe397cf","Type":"ContainerStarted","Data":"c1a7f987a888687725d237dace01d3a84f47f4e1844d05c5c65975d1299a3570"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.774495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549128-gnphx" event={"ID":"72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2","Type":"ContainerStarted","Data":"6352e5925ab5f2cdf78caa0c4beb3e78229b476ba5695da0e33310965d3e04f9"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.799896 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" event={"ID":"731d619c-5e97-4335-9d85-008faeb45d03","Type":"ContainerStarted","Data":"fcf52d18cc5cacbbc7dec36a836096219d20d90a7a5e262937db984bd4cc2e30"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.818012 4717 ???:1] "http: TLS handshake error from 192.168.126.11:40792: no serving certificate available for the kubelet" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.833458 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jddpv" podStartSLOduration=161.833428442 podStartE2EDuration="2m41.833428442s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:54.757674893 +0000 UTC m=+221.675323737" watchObservedRunningTime="2026-03-08 05:29:54.833428442 +0000 UTC m=+221.751077286" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 
05:29:54.842095 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" event={"ID":"11bd956f-1b8e-461d-b42b-50f1b7417607","Type":"ContainerStarted","Data":"40f8e980bd86b12240381cf376640e0d78baf2fcffa3769e90f24bcf8b631bd0"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.845078 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" event={"ID":"0b37d34e-7f32-4fe6-b26d-09e780f37d86","Type":"ContainerStarted","Data":"bf8b775a35d2dd1ff1813cedf93fcf6e4a29659d61589fd28afc85bec9c22791"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.848577 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.849985 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:55.349968922 +0000 UTC m=+222.267617766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.856965 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-szjtn" event={"ID":"efbfbedd-eea1-4275-b62f-d70bdec887a4","Type":"ContainerStarted","Data":"befdeaa43e02095cc5fc0e012bcaffcd19a5080ba016ca9f986b6fe8df7513d9"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.897074 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.917820 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.918020 4717 patch_prober.go:28] interesting pod/apiserver-76f77b778f-cw7dl container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.918147 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" podUID="11bd956f-1b8e-461d-b42b-50f1b7417607" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.949952 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:54 crc kubenswrapper[4717]: E0308 05:29:54.952351 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:55.452310031 +0000 UTC m=+222.369958875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.965130 4717 ???:1] "http: TLS handshake error from 192.168.126.11:40802: no serving certificate available for the kubelet" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.968106 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" event={"ID":"0507da4e-a2d5-43c2-b5e2-25f42085431c","Type":"ContainerStarted","Data":"462db2256290eb7fadfba8aaaba5795f2948f808f256cb70ad2bd60eb3d50de4"} Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.968165 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" event={"ID":"0507da4e-a2d5-43c2-b5e2-25f42085431c","Type":"ContainerStarted","Data":"02c5275786a2bbe84d54640bf3b29f8fe9e01922c1e7d8a17ad2f300301cba09"} Mar 
08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.977240 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:29:54 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:29:54 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:29:54 crc kubenswrapper[4717]: healthz check failed Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.977326 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:29:54 crc kubenswrapper[4717]: I0308 05:29:54.988340 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5d44b" podStartSLOduration=7.988312587 podStartE2EDuration="7.988312587s" podCreationTimestamp="2026-03-08 05:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:54.9149578 +0000 UTC m=+221.832606644" watchObservedRunningTime="2026-03-08 05:29:54.988312587 +0000 UTC m=+221.905961431" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.004834 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" event={"ID":"141bf3de-a931-4d7d-9957-34f4c180819a","Type":"ContainerStarted","Data":"e7d55f2597eb7b0611d286da1969deb6dde28933f4fca5f5d6fcb10b5eec4bed"} Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.044991 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" 
event={"ID":"f2c2b864-898a-4b84-a3ab-168051c21e34","Type":"ContainerStarted","Data":"14ab2dfd8d56364cd194906aab5c17a9e82279a1a71f33a3a84a4d23191201e9"} Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.053405 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hp5l5" podStartSLOduration=162.053389728 podStartE2EDuration="2m42.053389728s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.051024397 +0000 UTC m=+221.968673241" watchObservedRunningTime="2026-03-08 05:29:55.053389728 +0000 UTC m=+221.971038572" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.053722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.054174 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:55.554158278 +0000 UTC m=+222.471807122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.055518 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" podStartSLOduration=162.055510273 podStartE2EDuration="2m42.055510273s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.015120744 +0000 UTC m=+221.932769588" watchObservedRunningTime="2026-03-08 05:29:55.055510273 +0000 UTC m=+221.973159117" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.084068 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hd2k4" event={"ID":"1e41d744-e947-4c09-be5d-343f5a6d2bd1","Type":"ContainerStarted","Data":"28514e2b63849ede1b8f8af79ab5460efed85ec3e5d1baf6e5080ccb0c492097"} Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.093011 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lvb9t" event={"ID":"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6","Type":"ContainerStarted","Data":"a28fb19b8987295deea2504949ffa9f43aad40c5daea004d95ee5af5e226ca91"} Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.094166 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-lvb9t" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.104618 4717 ???:1] "http: TLS handshake error from 
192.168.126.11:40806: no serving certificate available for the kubelet" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.114721 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6jlgw" event={"ID":"113c7a12-4f47-40ba-be5e-abf62359ffe3","Type":"ContainerStarted","Data":"0cb4440d83643867ea3f181f1bc97d3d1ec6d36f8d3a706869fec0c92f7280e8"} Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.129177 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5fgj8" podStartSLOduration=162.129151077 podStartE2EDuration="2m42.129151077s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.113066369 +0000 UTC m=+222.030715213" watchObservedRunningTime="2026-03-08 05:29:55.129151077 +0000 UTC m=+222.046799921" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.167762 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.172315 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:55.672295418 +0000 UTC m=+222.589944262 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.219741 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" podStartSLOduration=162.219712281 podStartE2EDuration="2m42.219712281s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.167182995 +0000 UTC m=+222.084831839" watchObservedRunningTime="2026-03-08 05:29:55.219712281 +0000 UTC m=+222.137361115" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.233309 4717 ???:1] "http: TLS handshake error from 192.168.126.11:40816: no serving certificate available for the kubelet" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.233344 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-jvkgl" podStartSLOduration=162.233303744 podStartE2EDuration="2m42.233303744s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.224813593 +0000 UTC m=+222.142462437" watchObservedRunningTime="2026-03-08 05:29:55.233303744 +0000 UTC m=+222.150952588" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.272367 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.277535 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:55.777522663 +0000 UTC m=+222.695171507 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.313583 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xbmwz" podStartSLOduration=162.313554609 podStartE2EDuration="2m42.313554609s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.261963279 +0000 UTC m=+222.179612123" watchObservedRunningTime="2026-03-08 05:29:55.313554609 +0000 UTC m=+222.231203453" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.359802 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrbvv" podStartSLOduration=162.359779941 
podStartE2EDuration="2m42.359779941s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.314870574 +0000 UTC m=+222.232519418" watchObservedRunningTime="2026-03-08 05:29:55.359779941 +0000 UTC m=+222.277428785" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.360871 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" podStartSLOduration=162.360866579 podStartE2EDuration="2m42.360866579s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.357267876 +0000 UTC m=+222.274916720" watchObservedRunningTime="2026-03-08 05:29:55.360866579 +0000 UTC m=+222.278515423" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.373017 4717 ???:1] "http: TLS handshake error from 192.168.126.11:40820: no serving certificate available for the kubelet" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.373542 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.374021 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:55.87399986 +0000 UTC m=+222.791648694 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.445699 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lvb9t" podStartSLOduration=8.445654843 podStartE2EDuration="8.445654843s" podCreationTimestamp="2026-03-08 05:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.401444984 +0000 UTC m=+222.319093828" watchObservedRunningTime="2026-03-08 05:29:55.445654843 +0000 UTC m=+222.363303687" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.462181 4717 ???:1] "http: TLS handshake error from 192.168.126.11:40832: no serving certificate available for the kubelet" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.476590 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.477020 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-08 05:29:55.977004797 +0000 UTC m=+222.894653631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.479571 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" podStartSLOduration=163.479554894 podStartE2EDuration="2m43.479554894s" podCreationTimestamp="2026-03-08 05:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.447173262 +0000 UTC m=+222.364822106" watchObservedRunningTime="2026-03-08 05:29:55.479554894 +0000 UTC m=+222.397203738" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.482314 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" podStartSLOduration=162.482307415 podStartE2EDuration="2m42.482307415s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.4774844 +0000 UTC m=+222.395133244" watchObservedRunningTime="2026-03-08 05:29:55.482307415 +0000 UTC m=+222.399956259" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.537306 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hhrcp" 
podStartSLOduration=162.537278284 podStartE2EDuration="2m42.537278284s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.533753162 +0000 UTC m=+222.451402006" watchObservedRunningTime="2026-03-08 05:29:55.537278284 +0000 UTC m=+222.454927128" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.577473 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.577884 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:56.077863848 +0000 UTC m=+222.995512693 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.612476 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c5mwq" podStartSLOduration=162.612448587 podStartE2EDuration="2m42.612448587s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.606486742 +0000 UTC m=+222.524135586" watchObservedRunningTime="2026-03-08 05:29:55.612448587 +0000 UTC m=+222.530097431" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.670883 4717 patch_prober.go:28] interesting pod/console-operator-58897d9998-jvkgl container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.670977 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-jvkgl" podUID="5287b01d-2b35-4de8-8a24-f8fa7e778bc0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.677580 4717 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8lcq6" podStartSLOduration=162.677562959 podStartE2EDuration="2m42.677562959s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:55.676618805 +0000 UTC m=+222.594267639" watchObservedRunningTime="2026-03-08 05:29:55.677562959 +0000 UTC m=+222.595211803" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.680619 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.681236 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:56.181218014 +0000 UTC m=+223.098866868 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.782350 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.783548 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:56.283522193 +0000 UTC m=+223.201171037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.884701 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.885186 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:56.385169715 +0000 UTC m=+223.302818559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.957226 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:29:55 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:29:55 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:29:55 crc kubenswrapper[4717]: healthz check failed Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.957307 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.985645 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.987571 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 05:29:56.487525825 +0000 UTC m=+223.405174669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:55 crc kubenswrapper[4717]: I0308 05:29:55.987622 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:55 crc kubenswrapper[4717]: E0308 05:29:55.988168 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:56.488146881 +0000 UTC m=+223.405795725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:55.995511 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hgbcl"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.003105 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.007026 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.036328 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgbcl"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.092535 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.092872 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg4gn\" (UniqueName: \"kubernetes.io/projected/d612266d-387c-4561-a50f-02cd3cced887-kube-api-access-qg4gn\") pod \"community-operators-hgbcl\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " 
pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.092910 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-catalog-content\") pod \"community-operators-hgbcl\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.092998 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-utilities\") pod \"community-operators-hgbcl\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: E0308 05:29:56.093120 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:56.593097709 +0000 UTC m=+223.510746553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.133184 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lmkvf"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.148514 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lmkvf"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.148724 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.153900 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.158771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lvb9t" event={"ID":"b9f2cba0-2a4b-442e-aa97-aed7413ed0f6","Type":"ContainerStarted","Data":"cc74ff046d05a14cc9037b832cf83d0dda06aea3fae70d0fafb84500e93749a7"} Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.181369 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-szjtn" event={"ID":"efbfbedd-eea1-4275-b62f-d70bdec887a4","Type":"ContainerStarted","Data":"9e00704b802e0e86be93b433252fa1b749bb764dce930501c10ff50646caecfa"} Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.194035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-utilities\") pod \"community-operators-hgbcl\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.194087 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg4gn\" (UniqueName: \"kubernetes.io/projected/d612266d-387c-4561-a50f-02cd3cced887-kube-api-access-qg4gn\") pod \"community-operators-hgbcl\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.194112 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-catalog-content\") pod \"community-operators-hgbcl\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.194139 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:56 crc kubenswrapper[4717]: E0308 05:29:56.194542 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:56.694527775 +0000 UTC m=+223.612176609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.195353 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" event={"ID":"11bd956f-1b8e-461d-b42b-50f1b7417607","Type":"ContainerStarted","Data":"ae04eed59eaa8ce1074223fc5fa01520ce82e164c21086893b43167deab3ebd2"} Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.195776 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-utilities\") pod \"community-operators-hgbcl\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.196229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-catalog-content\") pod \"community-operators-hgbcl\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.213810 4717 ???:1] "http: TLS handshake error from 192.168.126.11:40834: no serving certificate available for the kubelet" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.220773 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-smkdd" 
event={"ID":"141bf3de-a931-4d7d-9957-34f4c180819a","Type":"ContainerStarted","Data":"962de73f8b6804954fe34514cb3261001bab1aed8bc8fb451ff0b11362c5e6cd"} Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.239913 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg4gn\" (UniqueName: \"kubernetes.io/projected/d612266d-387c-4561-a50f-02cd3cced887-kube-api-access-qg4gn\") pod \"community-operators-hgbcl\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.248874 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" event={"ID":"73c6f272-9791-479e-8dde-b761d5da5b75","Type":"ContainerStarted","Data":"cf403f409ef6eb26c0d1ece4fb901599b80d578cb35c261bb0e22ad49e6de71c"} Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.257023 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zhjjq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.257100 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.258898 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-hgxsr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 08 05:29:56 crc 
kubenswrapper[4717]: I0308 05:29:56.258985 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hgxsr" podUID="6bb8daf7-8d77-44c2-ab01-b02257d17ac9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.272415 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-jvkgl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.296348 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.296642 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-utilities\") pod \"certified-operators-lmkvf\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.296740 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhqmq\" (UniqueName: \"kubernetes.io/projected/5961d211-7900-41ef-9915-d935e9cec42a-kube-api-access-jhqmq\") pod \"certified-operators-lmkvf\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.296833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-catalog-content\") pod \"certified-operators-lmkvf\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:29:56 crc kubenswrapper[4717]: E0308 05:29:56.309360 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:56.809325938 +0000 UTC m=+223.726974782 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.313332 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrdlw" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.329895 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" podStartSLOduration=163.329865181 podStartE2EDuration="2m43.329865181s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:56.294826191 +0000 UTC m=+223.212475035" watchObservedRunningTime="2026-03-08 05:29:56.329865181 +0000 UTC m=+223.247514025" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.332804 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-n7sb6"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.356765 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.358331 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.400784 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7sb6"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.400876 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.401863 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.417376 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.417959 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhqmq\" (UniqueName: \"kubernetes.io/projected/5961d211-7900-41ef-9915-d935e9cec42a-kube-api-access-jhqmq\") pod \"certified-operators-lmkvf\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.418268 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-catalog-content\") pod \"certified-operators-lmkvf\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 
05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.418326 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.418716 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-utilities\") pod \"certified-operators-lmkvf\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.420997 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-catalog-content\") pod \"certified-operators-lmkvf\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:29:56 crc kubenswrapper[4717]: E0308 05:29:56.422163 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:56.922149019 +0000 UTC m=+223.839797863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.429834 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-utilities\") pod \"certified-operators-lmkvf\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.438162 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.445515 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.548987 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhqmq\" (UniqueName: \"kubernetes.io/projected/5961d211-7900-41ef-9915-d935e9cec42a-kube-api-access-jhqmq\") pod \"certified-operators-lmkvf\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.564416 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.564859 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-catalog-content\") pod \"community-operators-n7sb6\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.564916 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db7dece4-2341-4131-bf36-c20321eb8900-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"db7dece4-2341-4131-bf36-c20321eb8900\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.564949 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qprdm\" (UniqueName: \"kubernetes.io/projected/724eb749-c200-4929-9a63-f3384e410a6f-kube-api-access-qprdm\") pod \"community-operators-n7sb6\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.564975 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-utilities\") pod \"community-operators-n7sb6\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.564997 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db7dece4-2341-4131-bf36-c20321eb8900-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"db7dece4-2341-4131-bf36-c20321eb8900\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 05:29:56 crc kubenswrapper[4717]: E0308 05:29:56.565221 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:57.065203857 +0000 UTC m=+223.982852701 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.651366 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sqqgh"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.653248 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.653518 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sqqgh"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.666434 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-catalog-content\") pod \"community-operators-n7sb6\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.666469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db7dece4-2341-4131-bf36-c20321eb8900-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"db7dece4-2341-4131-bf36-c20321eb8900\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.666499 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qprdm\" (UniqueName: \"kubernetes.io/projected/724eb749-c200-4929-9a63-f3384e410a6f-kube-api-access-qprdm\") pod \"community-operators-n7sb6\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.666525 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-utilities\") pod \"community-operators-n7sb6\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.666550 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/db7dece4-2341-4131-bf36-c20321eb8900-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"db7dece4-2341-4131-bf36-c20321eb8900\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.666586 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.666612 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db7dece4-2341-4131-bf36-c20321eb8900-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"db7dece4-2341-4131-bf36-c20321eb8900\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 05:29:56 crc kubenswrapper[4717]: E0308 05:29:56.666958 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:57.166942131 +0000 UTC m=+224.084590975 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.667097 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-catalog-content\") pod \"community-operators-n7sb6\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.667469 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-utilities\") pod \"community-operators-n7sb6\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.739701 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db7dece4-2341-4131-bf36-c20321eb8900-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"db7dece4-2341-4131-bf36-c20321eb8900\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.744989 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qprdm\" (UniqueName: \"kubernetes.io/projected/724eb749-c200-4929-9a63-f3384e410a6f-kube-api-access-qprdm\") pod \"community-operators-n7sb6\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " 
pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.769292 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.769434 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-utilities\") pod \"certified-operators-sqqgh\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.769459 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j79qz\" (UniqueName: \"kubernetes.io/projected/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-kube-api-access-j79qz\") pod \"certified-operators-sqqgh\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.769503 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-catalog-content\") pod \"certified-operators-sqqgh\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:56 crc kubenswrapper[4717]: E0308 05:29:56.769626 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 05:29:57.269603579 +0000 UTC m=+224.187252423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.772518 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.796857 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.871304 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-utilities\") pod \"certified-operators-sqqgh\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.871368 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j79qz\" (UniqueName: \"kubernetes.io/projected/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-kube-api-access-j79qz\") pod \"certified-operators-sqqgh\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.871396 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.871436 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-catalog-content\") pod \"certified-operators-sqqgh\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.872511 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-catalog-content\") pod \"certified-operators-sqqgh\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.873478 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-utilities\") pod \"certified-operators-sqqgh\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:56 crc kubenswrapper[4717]: E0308 05:29:56.873826 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:57.373809977 +0000 UTC m=+224.291458821 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.919790 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j79qz\" (UniqueName: \"kubernetes.io/projected/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-kube-api-access-j79qz\") pod \"certified-operators-sqqgh\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.955034 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hgbcl"] Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.961900 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:29:56 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:29:56 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:29:56 crc kubenswrapper[4717]: healthz check failed Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.961982 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.972533 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:56 crc kubenswrapper[4717]: E0308 05:29:56.973007 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:57.472985275 +0000 UTC m=+224.390634119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:56 crc kubenswrapper[4717]: I0308 05:29:56.995979 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.058097 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.077644 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.078108 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:57.578093076 +0000 UTC m=+224.495741920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.190329 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.190590 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:57.690541249 +0000 UTC m=+224.608190093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.190750 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.191159 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:57.691143794 +0000 UTC m=+224.608792638 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.285613 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgbcl" event={"ID":"d612266d-387c-4561-a50f-02cd3cced887","Type":"ContainerStarted","Data":"4ed1ed426811b36bacde11eef678ee3e15e605e3a7160e65441ee43448aaa5f8"} Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.291858 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-szjtn" event={"ID":"efbfbedd-eea1-4275-b62f-d70bdec887a4","Type":"ContainerStarted","Data":"1b1887f7e5fffbd2ef1b9415588268cfc0a800a34a9c7b9b134e55049b587150"} Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.292755 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.293074 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:57.793044283 +0000 UTC m=+224.710693127 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.294189 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.295452 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:57.795435975 +0000 UTC m=+224.713084819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.296292 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zhjjq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.296327 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.323773 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kvb2r" Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.396874 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.397276 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:57.897253761 +0000 UTC m=+224.814902605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.499374 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.504662 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:58.004640392 +0000 UTC m=+224.922289236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.548772 4717 ???:1] "http: TLS handshake error from 192.168.126.11:40844: no serving certificate available for the kubelet" Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.560745 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.584648 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hwxhw"] Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.584969 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" podUID="3b2daede-7003-4bcc-9a92-6342eb319181" containerName="controller-manager" containerID="cri-o://4f7178bea61a5d816f4de80ed570964febc8c0976cf1b97c4e5e81ed0b42bf1d" gracePeriod=30 Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.608327 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.608653 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:58.108629804 +0000 UTC m=+225.026278648 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.665230 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6"] Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.710091 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.710467 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:58.21045487 +0000 UTC m=+225.128103714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.719087 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7sb6"] Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.737411 4717 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.811481 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.811924 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:58.311901617 +0000 UTC m=+225.229550461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.815129 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lmkvf"] Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.917055 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:57 crc kubenswrapper[4717]: E0308 05:29:57.917533 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 05:29:58.417511392 +0000 UTC m=+225.335160246 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rmg8f" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.949869 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sqqgh"] Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.955976 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:29:57 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:29:57 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:29:57 crc kubenswrapper[4717]: healthz check failed Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.956045 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:29:57 crc kubenswrapper[4717]: I0308 05:29:57.987908 4717 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-08T05:29:57.737451972Z","Handler":null,"Name":""} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.017962 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:58 crc kubenswrapper[4717]: E0308 05:29:58.018397 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 05:29:58.518374723 +0000 UTC m=+225.436023567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.063974 4717 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.077384 4717 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.127183 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:58 crc 
kubenswrapper[4717]: I0308 05:29:58.131339 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x7t8t"] Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.138368 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.140744 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7t8t"] Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.141252 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.148469 4717 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.148512 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.193373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rmg8f\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:58 crc 
kubenswrapper[4717]: I0308 05:29:58.229319 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.234374 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.297286 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"db7dece4-2341-4131-bf36-c20321eb8900","Type":"ContainerStarted","Data":"2f7020cb876cf867e9f7a5ef4922b3a68abd2b696c39df542b77a5ae7a7606e6"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.301053 4717 generic.go:334] "Generic (PLEG): container finished" podID="d612266d-387c-4561-a50f-02cd3cced887" containerID="c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e" exitCode=0 Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.301121 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgbcl" event={"ID":"d612266d-387c-4561-a50f-02cd3cced887","Type":"ContainerDied","Data":"c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.303110 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.303546 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqqgh" event={"ID":"64d82598-c4ba-4e83-8810-c4b9ad5b2f51","Type":"ContainerStarted","Data":"05fb64367a631eaefb89746bef48fa016400cdbb024195db65fb0616751c59ef"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.310389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-szjtn" event={"ID":"efbfbedd-eea1-4275-b62f-d70bdec887a4","Type":"ContainerStarted","Data":"454b57360dd6d160c57b7a28999c9a8a55a31b57120f0b507b2eecf53bc86b30"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.310437 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-szjtn" event={"ID":"efbfbedd-eea1-4275-b62f-d70bdec887a4","Type":"ContainerStarted","Data":"2c3cb9dc9c97fcf5f6e88c4e07d9f32a80ea84623fdf938fdbe1b09975d702ff"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.312864 4717 generic.go:334] "Generic (PLEG): container finished" podID="724eb749-c200-4929-9a63-f3384e410a6f" containerID="624c4a7f503af7d7628529821ea94445c058fd15c3228226aabf8f4d56e7bb88" exitCode=0 Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.312954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7sb6" event={"ID":"724eb749-c200-4929-9a63-f3384e410a6f","Type":"ContainerDied","Data":"624c4a7f503af7d7628529821ea94445c058fd15c3228226aabf8f4d56e7bb88"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.312993 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7sb6" event={"ID":"724eb749-c200-4929-9a63-f3384e410a6f","Type":"ContainerStarted","Data":"1fc7fc6f8869cf3aa98971a73536a89069ae40d88c877ee42f3b36f635cb8b77"} Mar 08 05:29:58 crc kubenswrapper[4717]: 
I0308 05:29:58.318921 4717 generic.go:334] "Generic (PLEG): container finished" podID="3b2daede-7003-4bcc-9a92-6342eb319181" containerID="4f7178bea61a5d816f4de80ed570964febc8c0976cf1b97c4e5e81ed0b42bf1d" exitCode=0 Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.318987 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" event={"ID":"3b2daede-7003-4bcc-9a92-6342eb319181","Type":"ContainerDied","Data":"4f7178bea61a5d816f4de80ed570964febc8c0976cf1b97c4e5e81ed0b42bf1d"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.319015 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" event={"ID":"3b2daede-7003-4bcc-9a92-6342eb319181","Type":"ContainerDied","Data":"68d7f54ba2c53afa8fdee28db793c495a94083e85a76fcd822b25f8697208200"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.319030 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d7f54ba2c53afa8fdee28db793c495a94083e85a76fcd822b25f8697208200" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.322925 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.329467 4717 generic.go:334] "Generic (PLEG): container finished" podID="5961d211-7900-41ef-9915-d935e9cec42a" containerID="1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc" exitCode=0 Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.329519 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmkvf" event={"ID":"5961d211-7900-41ef-9915-d935e9cec42a","Type":"ContainerDied","Data":"1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.329595 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmkvf" event={"ID":"5961d211-7900-41ef-9915-d935e9cec42a","Type":"ContainerStarted","Data":"83bc79adce75aeb898d839d8b796b43450c206a879b1d43cd6187f05ab1e004c"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.330968 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl2w7\" (UniqueName: \"kubernetes.io/projected/2ce686db-32d9-41b7-80fa-124e094dc4e8-kube-api-access-hl2w7\") pod \"redhat-marketplace-x7t8t\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.330994 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-utilities\") pod \"redhat-marketplace-x7t8t\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.331018 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-catalog-content\") pod \"redhat-marketplace-x7t8t\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.332411 4717 generic.go:334] "Generic (PLEG): container finished" podID="da83d24b-fb47-44b9-a05e-228eabe397cf" containerID="10c91eb7ceb9954f58af7155495ffdc92f9e0ce43fa023a49ee709e7eb89a677" exitCode=0 Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.333058 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" event={"ID":"da83d24b-fb47-44b9-a05e-228eabe397cf","Type":"ContainerDied","Data":"10c91eb7ceb9954f58af7155495ffdc92f9e0ce43fa023a49ee709e7eb89a677"} Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.333554 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" podUID="175ee8df-26ba-40d0-a30e-6f5bcb5435b7" containerName="route-controller-manager" containerID="cri-o://d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239" gracePeriod=30 Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.348821 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-szjtn" podStartSLOduration=11.34878175 podStartE2EDuration="11.34878175s" podCreationTimestamp="2026-03-08 05:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:58.344418466 +0000 UTC m=+225.262067320" watchObservedRunningTime="2026-03-08 05:29:58.34878175 +0000 UTC m=+225.266430594" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.431459 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3b2daede-7003-4bcc-9a92-6342eb319181-serving-cert\") pod \"3b2daede-7003-4bcc-9a92-6342eb319181\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.445089 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-config\") pod \"3b2daede-7003-4bcc-9a92-6342eb319181\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.445195 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-proxy-ca-bundles\") pod \"3b2daede-7003-4bcc-9a92-6342eb319181\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.445292 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-client-ca\") pod \"3b2daede-7003-4bcc-9a92-6342eb319181\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.445318 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzvzc\" (UniqueName: \"kubernetes.io/projected/3b2daede-7003-4bcc-9a92-6342eb319181-kube-api-access-fzvzc\") pod \"3b2daede-7003-4bcc-9a92-6342eb319181\" (UID: \"3b2daede-7003-4bcc-9a92-6342eb319181\") " Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.445650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-utilities\") pod \"redhat-marketplace-x7t8t\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc 
kubenswrapper[4717]: I0308 05:29:58.445692 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl2w7\" (UniqueName: \"kubernetes.io/projected/2ce686db-32d9-41b7-80fa-124e094dc4e8-kube-api-access-hl2w7\") pod \"redhat-marketplace-x7t8t\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.445717 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-catalog-content\") pod \"redhat-marketplace-x7t8t\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.446942 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-catalog-content\") pod \"redhat-marketplace-x7t8t\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.447387 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b2daede-7003-4bcc-9a92-6342eb319181-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3b2daede-7003-4bcc-9a92-6342eb319181" (UID: "3b2daede-7003-4bcc-9a92-6342eb319181"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.447761 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-utilities\") pod \"redhat-marketplace-x7t8t\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.448577 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3b2daede-7003-4bcc-9a92-6342eb319181" (UID: "3b2daede-7003-4bcc-9a92-6342eb319181"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.448657 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-config" (OuterVolumeSpecName: "config") pod "3b2daede-7003-4bcc-9a92-6342eb319181" (UID: "3b2daede-7003-4bcc-9a92-6342eb319181"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.449126 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-client-ca" (OuterVolumeSpecName: "client-ca") pod "3b2daede-7003-4bcc-9a92-6342eb319181" (UID: "3b2daede-7003-4bcc-9a92-6342eb319181"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.453859 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b2daede-7003-4bcc-9a92-6342eb319181-kube-api-access-fzvzc" (OuterVolumeSpecName: "kube-api-access-fzvzc") pod "3b2daede-7003-4bcc-9a92-6342eb319181" (UID: "3b2daede-7003-4bcc-9a92-6342eb319181"). InnerVolumeSpecName "kube-api-access-fzvzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.473445 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl2w7\" (UniqueName: \"kubernetes.io/projected/2ce686db-32d9-41b7-80fa-124e094dc4e8-kube-api-access-hl2w7\") pod \"redhat-marketplace-x7t8t\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.517388 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cdf65"] Mar 08 05:29:58 crc kubenswrapper[4717]: E0308 05:29:58.517707 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2daede-7003-4bcc-9a92-6342eb319181" containerName="controller-manager" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.517722 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2daede-7003-4bcc-9a92-6342eb319181" containerName="controller-manager" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.517837 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b2daede-7003-4bcc-9a92-6342eb319181" containerName="controller-manager" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.518667 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.535917 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdf65"] Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.547639 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9zzs\" (UniqueName: \"kubernetes.io/projected/eb6549f4-05b3-4309-b7f4-3b34fe523413-kube-api-access-n9zzs\") pod \"redhat-marketplace-cdf65\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.547728 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-utilities\") pod \"redhat-marketplace-cdf65\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.547772 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-catalog-content\") pod \"redhat-marketplace-cdf65\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.547868 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.547883 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-client-ca\") on node 
\"crc\" DevicePath \"\"" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.547893 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzvzc\" (UniqueName: \"kubernetes.io/projected/3b2daede-7003-4bcc-9a92-6342eb319181-kube-api-access-fzvzc\") on node \"crc\" DevicePath \"\"" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.547906 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2daede-7003-4bcc-9a92-6342eb319181-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.547916 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2daede-7003-4bcc-9a92-6342eb319181-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.586856 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.649486 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9zzs\" (UniqueName: \"kubernetes.io/projected/eb6549f4-05b3-4309-b7f4-3b34fe523413-kube-api-access-n9zzs\") pod \"redhat-marketplace-cdf65\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.649592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-utilities\") pod \"redhat-marketplace-cdf65\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.649625 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-catalog-content\") pod \"redhat-marketplace-cdf65\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.650186 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-catalog-content\") pod \"redhat-marketplace-cdf65\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.650446 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-utilities\") pod \"redhat-marketplace-cdf65\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.675166 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9zzs\" (UniqueName: \"kubernetes.io/projected/eb6549f4-05b3-4309-b7f4-3b34fe523413-kube-api-access-n9zzs\") pod \"redhat-marketplace-cdf65\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.706779 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.750904 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-config\") pod \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.751103 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m98fp\" (UniqueName: \"kubernetes.io/projected/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-kube-api-access-m98fp\") pod \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.751144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-serving-cert\") pod \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.751190 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-client-ca\") pod \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\" (UID: \"175ee8df-26ba-40d0-a30e-6f5bcb5435b7\") " Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.752793 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-config" (OuterVolumeSpecName: "config") pod "175ee8df-26ba-40d0-a30e-6f5bcb5435b7" (UID: "175ee8df-26ba-40d0-a30e-6f5bcb5435b7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.752819 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-client-ca" (OuterVolumeSpecName: "client-ca") pod "175ee8df-26ba-40d0-a30e-6f5bcb5435b7" (UID: "175ee8df-26ba-40d0-a30e-6f5bcb5435b7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.759366 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-kube-api-access-m98fp" (OuterVolumeSpecName: "kube-api-access-m98fp") pod "175ee8df-26ba-40d0-a30e-6f5bcb5435b7" (UID: "175ee8df-26ba-40d0-a30e-6f5bcb5435b7"). InnerVolumeSpecName "kube-api-access-m98fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.761109 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "175ee8df-26ba-40d0-a30e-6f5bcb5435b7" (UID: "175ee8df-26ba-40d0-a30e-6f5bcb5435b7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.820455 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmg8f"] Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.834136 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7t8t"] Mar 08 05:29:58 crc kubenswrapper[4717]: W0308 05:29:58.849117 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ce686db_32d9_41b7_80fa_124e094dc4e8.slice/crio-1886c95cd0a6853721a991d4ebfb850adc71bfa0062bd6590a7da99292c838f2 WatchSource:0}: Error finding container 1886c95cd0a6853721a991d4ebfb850adc71bfa0062bd6590a7da99292c838f2: Status 404 returned error can't find the container with id 1886c95cd0a6853721a991d4ebfb850adc71bfa0062bd6590a7da99292c838f2 Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.852303 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m98fp\" (UniqueName: \"kubernetes.io/projected/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-kube-api-access-m98fp\") on node \"crc\" DevicePath \"\"" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.852356 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.852385 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.852397 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175ee8df-26ba-40d0-a30e-6f5bcb5435b7-config\") on node \"crc\" DevicePath \"\"" 
Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.900362 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.957707 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:29:58 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:29:58 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:29:58 crc kubenswrapper[4717]: healthz check failed Mar 08 05:29:58 crc kubenswrapper[4717]: I0308 05:29:58.959324 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.121115 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n69hm"] Mar 08 05:29:59 crc kubenswrapper[4717]: E0308 05:29:59.121396 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175ee8df-26ba-40d0-a30e-6f5bcb5435b7" containerName="route-controller-manager" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.121413 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="175ee8df-26ba-40d0-a30e-6f5bcb5435b7" containerName="route-controller-manager" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.121512 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="175ee8df-26ba-40d0-a30e-6f5bcb5435b7" containerName="route-controller-manager" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.122257 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.124221 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.138065 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n69hm"] Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.157152 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-catalog-content\") pod \"redhat-operators-n69hm\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.157242 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-utilities\") pod \"redhat-operators-n69hm\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.157336 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llcwp\" (UniqueName: \"kubernetes.io/projected/06f4ab9f-48eb-410c-8915-c47c5cff1650-kube-api-access-llcwp\") pod \"redhat-operators-n69hm\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.178708 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdf65"] Mar 08 05:29:59 crc kubenswrapper[4717]: W0308 05:29:59.198730 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb6549f4_05b3_4309_b7f4_3b34fe523413.slice/crio-25f59b673485b2e88ed5ca9ba168a3145b3fe2694c8f452a945cc456c9f39c0d WatchSource:0}: Error finding container 25f59b673485b2e88ed5ca9ba168a3145b3fe2694c8f452a945cc456c9f39c0d: Status 404 returned error can't find the container with id 25f59b673485b2e88ed5ca9ba168a3145b3fe2694c8f452a945cc456c9f39c0d Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.259167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-catalog-content\") pod \"redhat-operators-n69hm\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.259546 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-utilities\") pod \"redhat-operators-n69hm\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.259571 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llcwp\" (UniqueName: \"kubernetes.io/projected/06f4ab9f-48eb-410c-8915-c47c5cff1650-kube-api-access-llcwp\") pod \"redhat-operators-n69hm\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.260441 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-catalog-content\") pod \"redhat-operators-n69hm\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc 
kubenswrapper[4717]: I0308 05:29:59.260655 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-utilities\") pod \"redhat-operators-n69hm\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.279810 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb"] Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.280964 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.283956 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc"] Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.287258 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.291886 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb"] Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.291955 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llcwp\" (UniqueName: \"kubernetes.io/projected/06f4ab9f-48eb-410c-8915-c47c5cff1650-kube-api-access-llcwp\") pod \"redhat-operators-n69hm\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.294733 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc"] Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.347945 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdf65" event={"ID":"eb6549f4-05b3-4309-b7f4-3b34fe523413","Type":"ContainerStarted","Data":"25f59b673485b2e88ed5ca9ba168a3145b3fe2694c8f452a945cc456c9f39c0d"} Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.353174 4717 generic.go:334] "Generic (PLEG): container finished" podID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerID="5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283" exitCode=0 Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.353266 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7t8t" event={"ID":"2ce686db-32d9-41b7-80fa-124e094dc4e8","Type":"ContainerDied","Data":"5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283"} Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.353293 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7t8t" 
event={"ID":"2ce686db-32d9-41b7-80fa-124e094dc4e8","Type":"ContainerStarted","Data":"1886c95cd0a6853721a991d4ebfb850adc71bfa0062bd6590a7da99292c838f2"} Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.369908 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-config\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.369995 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjr5j\" (UniqueName: \"kubernetes.io/projected/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-kube-api-access-kjr5j\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.370089 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-client-ca\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.370283 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-config\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 
05:29:59.370430 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050a2941-72c2-4173-9305-55c6d38e6d78-serving-cert\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.370495 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-proxy-ca-bundles\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.370556 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-client-ca\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.370698 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-serving-cert\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.371180 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxp65\" (UniqueName: 
\"kubernetes.io/projected/050a2941-72c2-4173-9305-55c6d38e6d78-kube-api-access-lxp65\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.378371 4717 generic.go:334] "Generic (PLEG): container finished" podID="175ee8df-26ba-40d0-a30e-6f5bcb5435b7" containerID="d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239" exitCode=0 Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.378458 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" event={"ID":"175ee8df-26ba-40d0-a30e-6f5bcb5435b7","Type":"ContainerDied","Data":"d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239"} Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.378493 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.378532 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6" event={"ID":"175ee8df-26ba-40d0-a30e-6f5bcb5435b7","Type":"ContainerDied","Data":"2aca45b5dee02b883a8261f3ee4111158ddd859e5786a070826c4056a1364656"} Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.378558 4717 scope.go:117] "RemoveContainer" containerID="d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.385041 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" event={"ID":"d7acdc7a-9697-4daf-9b82-253fcb5f1c55","Type":"ContainerStarted","Data":"bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7"} Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 
05:29:59.385080 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" event={"ID":"d7acdc7a-9697-4daf-9b82-253fcb5f1c55","Type":"ContainerStarted","Data":"4af17d465547681ccefd9c4f90db8a92824296b843bbcb22106efad5468a35b0"} Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.385123 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.387348 4717 generic.go:334] "Generic (PLEG): container finished" podID="db7dece4-2341-4131-bf36-c20321eb8900" containerID="ff1050afd27b73773cffbbc4dae32211d7012680eab99a509473508913bdcda5" exitCode=0 Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.387435 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"db7dece4-2341-4131-bf36-c20321eb8900","Type":"ContainerDied","Data":"ff1050afd27b73773cffbbc4dae32211d7012680eab99a509473508913bdcda5"} Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.391382 4717 generic.go:334] "Generic (PLEG): container finished" podID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerID="5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30" exitCode=0 Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.391558 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqqgh" event={"ID":"64d82598-c4ba-4e83-8810-c4b9ad5b2f51","Type":"ContainerDied","Data":"5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30"} Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.391890 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hwxhw" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.420221 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" podStartSLOduration=166.420185284 podStartE2EDuration="2m46.420185284s" podCreationTimestamp="2026-03-08 05:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:29:59.405281016 +0000 UTC m=+226.322929870" watchObservedRunningTime="2026-03-08 05:29:59.420185284 +0000 UTC m=+226.337834128" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.436463 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6"] Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.440520 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-z2rl6"] Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.443216 4717 scope.go:117] "RemoveContainer" containerID="d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239" Mar 08 05:29:59 crc kubenswrapper[4717]: E0308 05:29:59.448569 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239\": container with ID starting with d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239 not found: ID does not exist" containerID="d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.448617 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239"} err="failed to get container 
status \"d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239\": rpc error: code = NotFound desc = could not find container \"d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239\": container with ID starting with d7749bf9f00758308a3c9301baca20ac0b5177c0050843228294834a939c7239 not found: ID does not exist" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.467123 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.472469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-client-ca\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.472526 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-serving-cert\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.472551 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxp65\" (UniqueName: \"kubernetes.io/projected/050a2941-72c2-4173-9305-55c6d38e6d78-kube-api-access-lxp65\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.472590 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-config\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.472624 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjr5j\" (UniqueName: \"kubernetes.io/projected/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-kube-api-access-kjr5j\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.472654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-client-ca\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.472701 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-config\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.472729 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050a2941-72c2-4173-9305-55c6d38e6d78-serving-cert\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc 
kubenswrapper[4717]: I0308 05:29:59.472749 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-proxy-ca-bundles\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.474089 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-proxy-ca-bundles\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.474752 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-config\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.475133 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-config\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.476629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-client-ca\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " 
pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.479445 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050a2941-72c2-4173-9305-55c6d38e6d78-serving-cert\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.484418 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-serving-cert\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.486157 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-client-ca\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.493454 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjr5j\" (UniqueName: \"kubernetes.io/projected/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-kube-api-access-kjr5j\") pod \"route-controller-manager-8647d5c774-567mc\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.493530 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hwxhw"] Mar 08 05:29:59 crc kubenswrapper[4717]: 
I0308 05:29:59.496822 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hwxhw"] Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.497466 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxp65\" (UniqueName: \"kubernetes.io/projected/050a2941-72c2-4173-9305-55c6d38e6d78-kube-api-access-lxp65\") pod \"controller-manager-6f56df4dc7-t7mqb\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.515760 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w672l"] Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.517096 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.531003 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w672l"] Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.676641 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjbx5\" (UniqueName: \"kubernetes.io/projected/ad20120b-f363-485e-a130-c9e49e4605c4-kube-api-access-cjbx5\") pod \"redhat-operators-w672l\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.680096 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-utilities\") pod \"redhat-operators-w672l\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.680207 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-catalog-content\") pod \"redhat-operators-w672l\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.721744 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.736905 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.773000 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.774014 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.774670 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.774853 4717 patch_prober.go:28] interesting pod/console-f9d7485db-zhsbw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.774924 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zhsbw" podUID="cd118c79-042d-48f5-a360-884f4466f65b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.781462 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-catalog-content\") pod \"redhat-operators-w672l\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.781520 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-utilities\") pod \"redhat-operators-w672l\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.781544 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjbx5\" (UniqueName: \"kubernetes.io/projected/ad20120b-f363-485e-a130-c9e49e4605c4-kube-api-access-cjbx5\") pod \"redhat-operators-w672l\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc 
kubenswrapper[4717]: I0308 05:29:59.782161 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-utilities\") pod \"redhat-operators-w672l\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.782219 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-catalog-content\") pod \"redhat-operators-w672l\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.803306 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjbx5\" (UniqueName: \"kubernetes.io/projected/ad20120b-f363-485e-a130-c9e49e4605c4-kube-api-access-cjbx5\") pod \"redhat-operators-w672l\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.849550 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.856878 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="175ee8df-26ba-40d0-a30e-6f5bcb5435b7" path="/var/lib/kubelet/pods/175ee8df-26ba-40d0-a30e-6f5bcb5435b7/volumes" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.857996 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b2daede-7003-4bcc-9a92-6342eb319181" path="/var/lib/kubelet/pods/3b2daede-7003-4bcc-9a92-6342eb319181/volumes" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.858675 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.882916 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da83d24b-fb47-44b9-a05e-228eabe397cf-secret-volume\") pod \"da83d24b-fb47-44b9-a05e-228eabe397cf\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.882966 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da83d24b-fb47-44b9-a05e-228eabe397cf-config-volume\") pod \"da83d24b-fb47-44b9-a05e-228eabe397cf\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.882994 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqftc\" (UniqueName: \"kubernetes.io/projected/da83d24b-fb47-44b9-a05e-228eabe397cf-kube-api-access-gqftc\") pod \"da83d24b-fb47-44b9-a05e-228eabe397cf\" (UID: \"da83d24b-fb47-44b9-a05e-228eabe397cf\") " Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.893971 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.898641 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da83d24b-fb47-44b9-a05e-228eabe397cf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da83d24b-fb47-44b9-a05e-228eabe397cf" (UID: "da83d24b-fb47-44b9-a05e-228eabe397cf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.899430 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da83d24b-fb47-44b9-a05e-228eabe397cf-kube-api-access-gqftc" (OuterVolumeSpecName: "kube-api-access-gqftc") pod "da83d24b-fb47-44b9-a05e-228eabe397cf" (UID: "da83d24b-fb47-44b9-a05e-228eabe397cf"). InnerVolumeSpecName "kube-api-access-gqftc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.902758 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-cw7dl" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.904009 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da83d24b-fb47-44b9-a05e-228eabe397cf-config-volume" (OuterVolumeSpecName: "config-volume") pod "da83d24b-fb47-44b9-a05e-228eabe397cf" (UID: "da83d24b-fb47-44b9-a05e-228eabe397cf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.949469 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.983947 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:29:59 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:29:59 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:29:59 crc kubenswrapper[4717]: healthz check failed Mar 08 05:29:59 crc kubenswrapper[4717]: I0308 05:29:59.984013 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:29:59.998500 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da83d24b-fb47-44b9-a05e-228eabe397cf-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:29:59.998525 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da83d24b-fb47-44b9-a05e-228eabe397cf-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:29:59.998538 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqftc\" (UniqueName: \"kubernetes.io/projected/da83d24b-fb47-44b9-a05e-228eabe397cf-kube-api-access-gqftc\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.036808 4717 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-operators-n69hm"] Mar 08 05:30:00 crc kubenswrapper[4717]: W0308 05:30:00.096843 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06f4ab9f_48eb_410c_8915_c47c5cff1650.slice/crio-642ca03781ed5fa615f4e743f37ba682ae4a27dad337b1ad959d55a98107dfb3 WatchSource:0}: Error finding container 642ca03781ed5fa615f4e743f37ba682ae4a27dad337b1ad959d55a98107dfb3: Status 404 returned error can't find the container with id 642ca03781ed5fa615f4e743f37ba682ae4a27dad337b1ad959d55a98107dfb3 Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.202783 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549130-sbndg"] Mar 08 05:30:00 crc kubenswrapper[4717]: E0308 05:30:00.203244 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da83d24b-fb47-44b9-a05e-228eabe397cf" containerName="collect-profiles" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.203263 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="da83d24b-fb47-44b9-a05e-228eabe397cf" containerName="collect-profiles" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.203391 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="da83d24b-fb47-44b9-a05e-228eabe397cf" containerName="collect-profiles" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.203955 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549130-sbndg" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.218551 4717 ???:1] "http: TLS handshake error from 192.168.126.11:35364: no serving certificate available for the kubelet" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.218952 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.230399 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9"] Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.237880 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4"] Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.239036 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.247461 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9"] Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.247539 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4"] Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.247555 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549130-sbndg"] Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.298589 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.299006 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 
05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.305436 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.311385 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfjg4\" (UniqueName: \"kubernetes.io/projected/662f5bb0-c453-44f7-944a-5e39e1a580e9-kube-api-access-kfjg4\") pod \"auto-csr-approver-29549130-sbndg\" (UID: \"662f5bb0-c453-44f7-944a-5e39e1a580e9\") " pod="openshift-infra/auto-csr-approver-29549130-sbndg" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.412382 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e14e94f-4d73-4184-837c-9f7ae6e57b20-config-volume\") pod \"collect-profiles-29549130-zppk4\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.412443 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfjg4\" (UniqueName: \"kubernetes.io/projected/662f5bb0-c453-44f7-944a-5e39e1a580e9-kube-api-access-kfjg4\") pod \"auto-csr-approver-29549130-sbndg\" (UID: \"662f5bb0-c453-44f7-944a-5e39e1a580e9\") " pod="openshift-infra/auto-csr-approver-29549130-sbndg" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.412467 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e14e94f-4d73-4184-837c-9f7ae6e57b20-secret-volume\") pod \"collect-profiles-29549130-zppk4\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.412524 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gjw6\" (UniqueName: \"kubernetes.io/projected/1e14e94f-4d73-4184-837c-9f7ae6e57b20-kube-api-access-4gjw6\") pod \"collect-profiles-29549130-zppk4\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.419492 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n69hm" event={"ID":"06f4ab9f-48eb-410c-8915-c47c5cff1650","Type":"ContainerStarted","Data":"642ca03781ed5fa615f4e743f37ba682ae4a27dad337b1ad959d55a98107dfb3"} Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.431138 4717 generic.go:334] "Generic (PLEG): container finished" podID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerID="177d6bd85165a28b2665e2355cfa761222f200bd9e49303f1f61f66f301ba8b9" exitCode=0 Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.431208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdf65" event={"ID":"eb6549f4-05b3-4309-b7f4-3b34fe523413","Type":"ContainerDied","Data":"177d6bd85165a28b2665e2355cfa761222f200bd9e49303f1f61f66f301ba8b9"} Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.433104 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfjg4\" (UniqueName: \"kubernetes.io/projected/662f5bb0-c453-44f7-944a-5e39e1a580e9-kube-api-access-kfjg4\") pod \"auto-csr-approver-29549130-sbndg\" (UID: \"662f5bb0-c453-44f7-944a-5e39e1a580e9\") " pod="openshift-infra/auto-csr-approver-29549130-sbndg" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.443489 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549115-zkqg9" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.445856 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1a7f987a888687725d237dace01d3a84f47f4e1844d05c5c65975d1299a3570" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.470427 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ct6fm" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.510543 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc"] Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.516214 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e14e94f-4d73-4184-837c-9f7ae6e57b20-secret-volume\") pod \"collect-profiles-29549130-zppk4\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.516286 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gjw6\" (UniqueName: \"kubernetes.io/projected/1e14e94f-4d73-4184-837c-9f7ae6e57b20-kube-api-access-4gjw6\") pod \"collect-profiles-29549130-zppk4\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.516342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e14e94f-4d73-4184-837c-9f7ae6e57b20-config-volume\") pod \"collect-profiles-29549130-zppk4\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 
05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.517301 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e14e94f-4d73-4184-837c-9f7ae6e57b20-config-volume\") pod \"collect-profiles-29549130-zppk4\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.517524 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w672l"] Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.542776 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb"] Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.544326 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e14e94f-4d73-4184-837c-9f7ae6e57b20-secret-volume\") pod \"collect-profiles-29549130-zppk4\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.554969 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-hgxsr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.555396 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hgxsr" podUID="6bb8daf7-8d77-44c2-ab01-b02257d17ac9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.556464 4717 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-hgxsr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.556538 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hgxsr" podUID="6bb8daf7-8d77-44c2-ab01-b02257d17ac9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.583888 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gjw6\" (UniqueName: \"kubernetes.io/projected/1e14e94f-4d73-4184-837c-9f7ae6e57b20-kube-api-access-4gjw6\") pod \"collect-profiles-29549130-zppk4\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.695485 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549130-sbndg" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.698932 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.699790 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.705037 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.711046 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.711164 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.711939 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.827663 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"166e54f3-4d7c-4ba9-b04a-c03e3df27379\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.827761 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"166e54f3-4d7c-4ba9-b04a-c03e3df27379\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.930623 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"166e54f3-4d7c-4ba9-b04a-c03e3df27379\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.930698 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"166e54f3-4d7c-4ba9-b04a-c03e3df27379\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.930842 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"166e54f3-4d7c-4ba9-b04a-c03e3df27379\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.952127 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"166e54f3-4d7c-4ba9-b04a-c03e3df27379\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.954782 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:00 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:30:00 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:30:00 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:00 crc kubenswrapper[4717]: I0308 05:30:00.955050 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.010328 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.048049 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.117826 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.239501 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db7dece4-2341-4131-bf36-c20321eb8900-kube-api-access\") pod \"db7dece4-2341-4131-bf36-c20321eb8900\" (UID: \"db7dece4-2341-4131-bf36-c20321eb8900\") " Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.241101 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db7dece4-2341-4131-bf36-c20321eb8900-kubelet-dir\") pod \"db7dece4-2341-4131-bf36-c20321eb8900\" (UID: \"db7dece4-2341-4131-bf36-c20321eb8900\") " Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.241401 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db7dece4-2341-4131-bf36-c20321eb8900-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "db7dece4-2341-4131-bf36-c20321eb8900" (UID: "db7dece4-2341-4131-bf36-c20321eb8900"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.255903 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db7dece4-2341-4131-bf36-c20321eb8900-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "db7dece4-2341-4131-bf36-c20321eb8900" (UID: "db7dece4-2341-4131-bf36-c20321eb8900"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.350285 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db7dece4-2341-4131-bf36-c20321eb8900-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.350325 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db7dece4-2341-4131-bf36-c20321eb8900-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.482016 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" event={"ID":"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf","Type":"ContainerStarted","Data":"6c058a4a5b87fb10fb9612ca53c9b072f8dc3555475001fdefdb835619e0db46"} Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.482423 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" event={"ID":"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf","Type":"ContainerStarted","Data":"7e30d83b8705c6cb9629cb0e46c8ca316e5a62921efcfecd91cdb6b6d59a65aa"} Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.482922 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.508694 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549130-sbndg"] Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.511138 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" 
event={"ID":"050a2941-72c2-4173-9305-55c6d38e6d78","Type":"ContainerStarted","Data":"7589483e111792175409aed1c60a96a8d290b0ccfd5e80c8d6068b13288597c6"} Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.511202 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" event={"ID":"050a2941-72c2-4173-9305-55c6d38e6d78","Type":"ContainerStarted","Data":"7739b6a7f5315d7b1e8fc046491296a5e52c10b88a3f9f1ba70bb63ebc4ab1b9"} Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.512494 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.519586 4717 generic.go:334] "Generic (PLEG): container finished" podID="ad20120b-f363-485e-a130-c9e49e4605c4" containerID="356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633" exitCode=0 Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.519744 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w672l" event={"ID":"ad20120b-f363-485e-a130-c9e49e4605c4","Type":"ContainerDied","Data":"356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633"} Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.519782 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w672l" event={"ID":"ad20120b-f363-485e-a130-c9e49e4605c4","Type":"ContainerStarted","Data":"1689181308d709a54903c6962c992831090073f655a45e37ce92827318fb781f"} Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.525229 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" podStartSLOduration=4.525199169 podStartE2EDuration="4.525199169s" podCreationTimestamp="2026-03-08 05:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:30:01.522232932 +0000 UTC m=+228.439881776" watchObservedRunningTime="2026-03-08 05:30:01.525199169 +0000 UTC m=+228.442848013" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.534718 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.535550 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:30:01 crc kubenswrapper[4717]: W0308 05:30:01.562815 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod662f5bb0_c453_44f7_944a_5e39e1a580e9.slice/crio-bdd093d50642bad05005a1be85e47a793004a157faf24dbe7746764e88a900fe WatchSource:0}: Error finding container bdd093d50642bad05005a1be85e47a793004a157faf24dbe7746764e88a900fe: Status 404 returned error can't find the container with id bdd093d50642bad05005a1be85e47a793004a157faf24dbe7746764e88a900fe Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.576976 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" podStartSLOduration=4.576950424 podStartE2EDuration="4.576950424s" podCreationTimestamp="2026-03-08 05:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:30:01.568211067 +0000 UTC m=+228.485859911" watchObservedRunningTime="2026-03-08 05:30:01.576950424 +0000 UTC m=+228.494599268" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.608435 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4"] Mar 08 05:30:01 crc 
kubenswrapper[4717]: I0308 05:30:01.611282 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"db7dece4-2341-4131-bf36-c20321eb8900","Type":"ContainerDied","Data":"2f7020cb876cf867e9f7a5ef4922b3a68abd2b696c39df542b77a5ae7a7606e6"} Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.611382 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f7020cb876cf867e9f7a5ef4922b3a68abd2b696c39df542b77a5ae7a7606e6" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.611526 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.672014 4717 generic.go:334] "Generic (PLEG): container finished" podID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerID="efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9" exitCode=0 Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.673272 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n69hm" event={"ID":"06f4ab9f-48eb-410c-8915-c47c5cff1650","Type":"ContainerDied","Data":"efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9"} Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.841236 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da83d24b-fb47-44b9-a05e-228eabe397cf" path="/var/lib/kubelet/pods/da83d24b-fb47-44b9-a05e-228eabe397cf/volumes" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.953071 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:01 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:30:01 crc kubenswrapper[4717]: 
[+]process-running ok Mar 08 05:30:01 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.953153 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:01 crc kubenswrapper[4717]: I0308 05:30:01.965001 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 05:30:02 crc kubenswrapper[4717]: W0308 05:30:02.070626 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod166e54f3_4d7c_4ba9_b04a_c03e3df27379.slice/crio-3a246a77094bb85a6ea749886927919d3adbf2946b62e3def61811960a14633f WatchSource:0}: Error finding container 3a246a77094bb85a6ea749886927919d3adbf2946b62e3def61811960a14633f: Status 404 returned error can't find the container with id 3a246a77094bb85a6ea749886927919d3adbf2946b62e3def61811960a14633f Mar 08 05:30:02 crc kubenswrapper[4717]: I0308 05:30:02.712155 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549130-sbndg" event={"ID":"662f5bb0-c453-44f7-944a-5e39e1a580e9","Type":"ContainerStarted","Data":"bdd093d50642bad05005a1be85e47a793004a157faf24dbe7746764e88a900fe"} Mar 08 05:30:02 crc kubenswrapper[4717]: I0308 05:30:02.719718 4717 generic.go:334] "Generic (PLEG): container finished" podID="1e14e94f-4d73-4184-837c-9f7ae6e57b20" containerID="278121bbcd6a10c1b105beaafc10b9c8eae6fef1ca448bfa6dd5efac355dbf07" exitCode=0 Mar 08 05:30:02 crc kubenswrapper[4717]: I0308 05:30:02.720116 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" event={"ID":"1e14e94f-4d73-4184-837c-9f7ae6e57b20","Type":"ContainerDied","Data":"278121bbcd6a10c1b105beaafc10b9c8eae6fef1ca448bfa6dd5efac355dbf07"} Mar 
08 05:30:02 crc kubenswrapper[4717]: I0308 05:30:02.720183 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" event={"ID":"1e14e94f-4d73-4184-837c-9f7ae6e57b20","Type":"ContainerStarted","Data":"b91cd19b43690f7feffa63ddc42ad8d2881abe38b4abca1911dacbdc616fea00"} Mar 08 05:30:02 crc kubenswrapper[4717]: I0308 05:30:02.730644 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"166e54f3-4d7c-4ba9-b04a-c03e3df27379","Type":"ContainerStarted","Data":"3a246a77094bb85a6ea749886927919d3adbf2946b62e3def61811960a14633f"} Mar 08 05:30:02 crc kubenswrapper[4717]: I0308 05:30:02.967183 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:02 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:30:02 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:30:02 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:02 crc kubenswrapper[4717]: I0308 05:30:02.967269 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:03 crc kubenswrapper[4717]: I0308 05:30:03.036765 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lvb9t" Mar 08 05:30:03 crc kubenswrapper[4717]: I0308 05:30:03.757329 4717 generic.go:334] "Generic (PLEG): container finished" podID="166e54f3-4d7c-4ba9-b04a-c03e3df27379" containerID="bc6ed84f1cb807e19198059d057ab99588d0ad2b6c898a784f884d3ddf68e1a5" exitCode=0 Mar 08 05:30:03 crc kubenswrapper[4717]: I0308 05:30:03.757444 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"166e54f3-4d7c-4ba9-b04a-c03e3df27379","Type":"ContainerDied","Data":"bc6ed84f1cb807e19198059d057ab99588d0ad2b6c898a784f884d3ddf68e1a5"} Mar 08 05:30:03 crc kubenswrapper[4717]: I0308 05:30:03.958213 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:03 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:30:03 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:30:03 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:03 crc kubenswrapper[4717]: I0308 05:30:03.958290 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:03 crc kubenswrapper[4717]: I0308 05:30:03.975378 4717 ???:1] "http: TLS handshake error from 192.168.126.11:35380: no serving certificate available for the kubelet" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.119548 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.119615 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.195009 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.259674 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e14e94f-4d73-4184-837c-9f7ae6e57b20-secret-volume\") pod \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.259782 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e14e94f-4d73-4184-837c-9f7ae6e57b20-config-volume\") pod \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.259844 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gjw6\" (UniqueName: \"kubernetes.io/projected/1e14e94f-4d73-4184-837c-9f7ae6e57b20-kube-api-access-4gjw6\") pod \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\" (UID: \"1e14e94f-4d73-4184-837c-9f7ae6e57b20\") " Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.261254 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e14e94f-4d73-4184-837c-9f7ae6e57b20-config-volume" (OuterVolumeSpecName: "config-volume") pod "1e14e94f-4d73-4184-837c-9f7ae6e57b20" (UID: "1e14e94f-4d73-4184-837c-9f7ae6e57b20"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.269626 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e14e94f-4d73-4184-837c-9f7ae6e57b20-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1e14e94f-4d73-4184-837c-9f7ae6e57b20" (UID: "1e14e94f-4d73-4184-837c-9f7ae6e57b20"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.284794 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e14e94f-4d73-4184-837c-9f7ae6e57b20-kube-api-access-4gjw6" (OuterVolumeSpecName: "kube-api-access-4gjw6") pod "1e14e94f-4d73-4184-837c-9f7ae6e57b20" (UID: "1e14e94f-4d73-4184-837c-9f7ae6e57b20"). InnerVolumeSpecName "kube-api-access-4gjw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.361367 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e14e94f-4d73-4184-837c-9f7ae6e57b20-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.361859 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e14e94f-4d73-4184-837c-9f7ae6e57b20-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.361872 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gjw6\" (UniqueName: \"kubernetes.io/projected/1e14e94f-4d73-4184-837c-9f7ae6e57b20-kube-api-access-4gjw6\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.793288 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.793514 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4" event={"ID":"1e14e94f-4d73-4184-837c-9f7ae6e57b20","Type":"ContainerDied","Data":"b91cd19b43690f7feffa63ddc42ad8d2881abe38b4abca1911dacbdc616fea00"} Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.793552 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b91cd19b43690f7feffa63ddc42ad8d2881abe38b4abca1911dacbdc616fea00" Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.953670 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:04 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:30:04 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:30:04 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:04 crc kubenswrapper[4717]: I0308 05:30:04.953897 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.129782 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.176453 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kubelet-dir\") pod \"166e54f3-4d7c-4ba9-b04a-c03e3df27379\" (UID: \"166e54f3-4d7c-4ba9-b04a-c03e3df27379\") " Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.176528 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kube-api-access\") pod \"166e54f3-4d7c-4ba9-b04a-c03e3df27379\" (UID: \"166e54f3-4d7c-4ba9-b04a-c03e3df27379\") " Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.178182 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "166e54f3-4d7c-4ba9-b04a-c03e3df27379" (UID: "166e54f3-4d7c-4ba9-b04a-c03e3df27379"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.189620 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "166e54f3-4d7c-4ba9-b04a-c03e3df27379" (UID: "166e54f3-4d7c-4ba9-b04a-c03e3df27379"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.277883 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.277924 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/166e54f3-4d7c-4ba9-b04a-c03e3df27379-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.370169 4717 ???:1] "http: TLS handshake error from 192.168.126.11:35396: no serving certificate available for the kubelet" Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.817024 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"166e54f3-4d7c-4ba9-b04a-c03e3df27379","Type":"ContainerDied","Data":"3a246a77094bb85a6ea749886927919d3adbf2946b62e3def61811960a14633f"} Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.817071 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.817085 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a246a77094bb85a6ea749886927919d3adbf2946b62e3def61811960a14633f" Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.954367 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:05 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:30:05 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:30:05 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:05 crc kubenswrapper[4717]: I0308 05:30:05.954469 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:06 crc kubenswrapper[4717]: I0308 05:30:06.952448 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:06 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:30:06 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:30:06 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:06 crc kubenswrapper[4717]: I0308 05:30:06.952524 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:07 crc kubenswrapper[4717]: 
I0308 05:30:07.953509 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:07 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:30:07 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:30:07 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:07 crc kubenswrapper[4717]: I0308 05:30:07.954031 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:08 crc kubenswrapper[4717]: I0308 05:30:08.952563 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:08 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:30:08 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:30:08 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:08 crc kubenswrapper[4717]: I0308 05:30:08.952640 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:09 crc kubenswrapper[4717]: I0308 05:30:09.773437 4717 patch_prober.go:28] interesting pod/console-f9d7485db-zhsbw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 08 05:30:09 crc kubenswrapper[4717]: I0308 
05:30:09.773514 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zhsbw" podUID="cd118c79-042d-48f5-a360-884f4466f65b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 08 05:30:09 crc kubenswrapper[4717]: I0308 05:30:09.955146 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:09 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Mar 08 05:30:09 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:30:09 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:09 crc kubenswrapper[4717]: I0308 05:30:09.955234 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:10 crc kubenswrapper[4717]: I0308 05:30:10.557203 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hgxsr" Mar 08 05:30:10 crc kubenswrapper[4717]: I0308 05:30:10.952926 4717 patch_prober.go:28] interesting pod/router-default-5444994796-bdgwk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 05:30:10 crc kubenswrapper[4717]: [+]has-synced ok Mar 08 05:30:10 crc kubenswrapper[4717]: [+]process-running ok Mar 08 05:30:10 crc kubenswrapper[4717]: healthz check failed Mar 08 05:30:10 crc kubenswrapper[4717]: I0308 05:30:10.953014 4717 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-bdgwk" podUID="f8e73a19-e7c1-4504-8499-4566b10f2682" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 05:30:11 crc kubenswrapper[4717]: I0308 05:30:11.952507 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:30:11 crc kubenswrapper[4717]: I0308 05:30:11.955559 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-bdgwk" Mar 08 05:30:17 crc kubenswrapper[4717]: I0308 05:30:17.046819 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb"] Mar 08 05:30:17 crc kubenswrapper[4717]: I0308 05:30:17.048376 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" podUID="050a2941-72c2-4173-9305-55c6d38e6d78" containerName="controller-manager" containerID="cri-o://7589483e111792175409aed1c60a96a8d290b0ccfd5e80c8d6068b13288597c6" gracePeriod=30 Mar 08 05:30:17 crc kubenswrapper[4717]: I0308 05:30:17.060803 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc"] Mar 08 05:30:17 crc kubenswrapper[4717]: I0308 05:30:17.061149 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" podUID="5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" containerName="route-controller-manager" containerID="cri-o://6c058a4a5b87fb10fb9612ca53c9b072f8dc3555475001fdefdb835619e0db46" gracePeriod=30 Mar 08 05:30:17 crc kubenswrapper[4717]: I0308 05:30:17.938095 4717 generic.go:334] "Generic (PLEG): container finished" podID="050a2941-72c2-4173-9305-55c6d38e6d78" 
containerID="7589483e111792175409aed1c60a96a8d290b0ccfd5e80c8d6068b13288597c6" exitCode=0 Mar 08 05:30:17 crc kubenswrapper[4717]: I0308 05:30:17.938170 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" event={"ID":"050a2941-72c2-4173-9305-55c6d38e6d78","Type":"ContainerDied","Data":"7589483e111792175409aed1c60a96a8d290b0ccfd5e80c8d6068b13288597c6"} Mar 08 05:30:17 crc kubenswrapper[4717]: I0308 05:30:17.939351 4717 generic.go:334] "Generic (PLEG): container finished" podID="5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" containerID="6c058a4a5b87fb10fb9612ca53c9b072f8dc3555475001fdefdb835619e0db46" exitCode=0 Mar 08 05:30:17 crc kubenswrapper[4717]: I0308 05:30:17.939377 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" event={"ID":"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf","Type":"ContainerDied","Data":"6c058a4a5b87fb10fb9612ca53c9b072f8dc3555475001fdefdb835619e0db46"} Mar 08 05:30:18 crc kubenswrapper[4717]: I0308 05:30:18.316819 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:30:19 crc kubenswrapper[4717]: I0308 05:30:19.794813 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:30:19 crc kubenswrapper[4717]: I0308 05:30:19.804352 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:30:20 crc kubenswrapper[4717]: I0308 05:30:20.723593 4717 patch_prober.go:28] interesting pod/controller-manager-6f56df4dc7-t7mqb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 08 05:30:20 crc kubenswrapper[4717]: I0308 05:30:20.723706 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" podUID="050a2941-72c2-4173-9305-55c6d38e6d78" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:30:20 crc kubenswrapper[4717]: I0308 05:30:20.738763 4717 patch_prober.go:28] interesting pod/route-controller-manager-8647d5c774-567mc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:30:20 crc kubenswrapper[4717]: I0308 05:30:20.738870 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" podUID="5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:30:21 crc kubenswrapper[4717]: I0308 05:30:21.951058 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.219786 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.220242 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:30:23 crc 
kubenswrapper[4717]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 08 05:30:23 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nrfjr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29549128-gnphx_openshift-infra(72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 08 05:30:23 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.221468 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29549128-gnphx" podUID="72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2" Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.249676 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/openshift4/ose-cli:latest" Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.249917 4717 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 05:30:23 crc kubenswrapper[4717]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 08 05:30:23 crc kubenswrapper[4717]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kfjg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29549130-sbndg_openshift-infra(662f5bb0-c453-44f7-944a-5e39e1a580e9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 08 05:30:23 crc kubenswrapper[4717]: > logger="UnhandledError" Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.252041 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29549130-sbndg" podUID="662f5bb0-c453-44f7-944a-5e39e1a580e9" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.259984 4717 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.261253 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.292854 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-99b65ccfc-rhcqf"] Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.293196 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e14e94f-4d73-4184-837c-9f7ae6e57b20" containerName="collect-profiles" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.293212 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e14e94f-4d73-4184-837c-9f7ae6e57b20" containerName="collect-profiles" Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.293228 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050a2941-72c2-4173-9305-55c6d38e6d78" containerName="controller-manager" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.293235 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="050a2941-72c2-4173-9305-55c6d38e6d78" containerName="controller-manager" Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.293245 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166e54f3-4d7c-4ba9-b04a-c03e3df27379" containerName="pruner" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.293254 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="166e54f3-4d7c-4ba9-b04a-c03e3df27379" containerName="pruner" Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.293295 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7dece4-2341-4131-bf36-c20321eb8900" containerName="pruner" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.293305 4717 
state_mem.go:107] "Deleted CPUSet assignment" podUID="db7dece4-2341-4131-bf36-c20321eb8900" containerName="pruner" Mar 08 05:30:23 crc kubenswrapper[4717]: E0308 05:30:23.293316 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" containerName="route-controller-manager" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.293325 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" containerName="route-controller-manager" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.293474 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="050a2941-72c2-4173-9305-55c6d38e6d78" containerName="controller-manager" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.293491 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7dece4-2341-4131-bf36-c20321eb8900" containerName="pruner" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.293502 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="166e54f3-4d7c-4ba9-b04a-c03e3df27379" containerName="pruner" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.293513 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e14e94f-4d73-4184-837c-9f7ae6e57b20" containerName="collect-profiles" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.293523 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" containerName="route-controller-manager" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.297022 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.319432 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-99b65ccfc-rhcqf"] Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.454861 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-config\") pod \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.454934 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-serving-cert\") pod \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455003 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050a2941-72c2-4173-9305-55c6d38e6d78-serving-cert\") pod \"050a2941-72c2-4173-9305-55c6d38e6d78\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455037 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxp65\" (UniqueName: \"kubernetes.io/projected/050a2941-72c2-4173-9305-55c6d38e6d78-kube-api-access-lxp65\") pod \"050a2941-72c2-4173-9305-55c6d38e6d78\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455064 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjr5j\" (UniqueName: \"kubernetes.io/projected/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-kube-api-access-kjr5j\") pod 
\"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455116 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-proxy-ca-bundles\") pod \"050a2941-72c2-4173-9305-55c6d38e6d78\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455139 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-client-ca\") pod \"050a2941-72c2-4173-9305-55c6d38e6d78\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455170 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-client-ca\") pod \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\" (UID: \"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf\") " Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455194 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-config\") pod \"050a2941-72c2-4173-9305-55c6d38e6d78\" (UID: \"050a2941-72c2-4173-9305-55c6d38e6d78\") " Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455286 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-config\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455341 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-proxy-ca-bundles\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455365 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/346c1269-0f97-43d6-8f52-31f96844a6cf-serving-cert\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455388 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-client-ca\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.455405 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpg8x\" (UniqueName: \"kubernetes.io/projected/346c1269-0f97-43d6-8f52-31f96844a6cf-kube-api-access-gpg8x\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.457534 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-config" (OuterVolumeSpecName: "config") pod "050a2941-72c2-4173-9305-55c6d38e6d78" (UID: "050a2941-72c2-4173-9305-55c6d38e6d78"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.458311 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "050a2941-72c2-4173-9305-55c6d38e6d78" (UID: "050a2941-72c2-4173-9305-55c6d38e6d78"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.458396 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-config" (OuterVolumeSpecName: "config") pod "5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" (UID: "5a1f4f30-31e2-403c-a5a3-c60ef90f79cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.458671 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-client-ca" (OuterVolumeSpecName: "client-ca") pod "050a2941-72c2-4173-9305-55c6d38e6d78" (UID: "050a2941-72c2-4173-9305-55c6d38e6d78"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.458959 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-client-ca" (OuterVolumeSpecName: "client-ca") pod "5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" (UID: "5a1f4f30-31e2-403c-a5a3-c60ef90f79cf"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.463078 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050a2941-72c2-4173-9305-55c6d38e6d78-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "050a2941-72c2-4173-9305-55c6d38e6d78" (UID: "050a2941-72c2-4173-9305-55c6d38e6d78"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.463109 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" (UID: "5a1f4f30-31e2-403c-a5a3-c60ef90f79cf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.463150 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/050a2941-72c2-4173-9305-55c6d38e6d78-kube-api-access-lxp65" (OuterVolumeSpecName: "kube-api-access-lxp65") pod "050a2941-72c2-4173-9305-55c6d38e6d78" (UID: "050a2941-72c2-4173-9305-55c6d38e6d78"). InnerVolumeSpecName "kube-api-access-lxp65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.463139 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-kube-api-access-kjr5j" (OuterVolumeSpecName: "kube-api-access-kjr5j") pod "5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" (UID: "5a1f4f30-31e2-403c-a5a3-c60ef90f79cf"). InnerVolumeSpecName "kube-api-access-kjr5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566054 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-config\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566216 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-proxy-ca-bundles\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566282 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/346c1269-0f97-43d6-8f52-31f96844a6cf-serving-cert\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566348 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-client-ca\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566381 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpg8x\" (UniqueName: \"kubernetes.io/projected/346c1269-0f97-43d6-8f52-31f96844a6cf-kube-api-access-gpg8x\") pod 
\"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566481 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566529 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566542 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566554 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050a2941-72c2-4173-9305-55c6d38e6d78-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566591 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566604 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566616 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050a2941-72c2-4173-9305-55c6d38e6d78-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:23 crc 
kubenswrapper[4717]: I0308 05:30:23.566628 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxp65\" (UniqueName: \"kubernetes.io/projected/050a2941-72c2-4173-9305-55c6d38e6d78-kube-api-access-lxp65\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.566643 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjr5j\" (UniqueName: \"kubernetes.io/projected/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf-kube-api-access-kjr5j\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.568282 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-client-ca\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.568429 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-config\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.568661 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-proxy-ca-bundles\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.570136 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/346c1269-0f97-43d6-8f52-31f96844a6cf-serving-cert\") 
pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.583395 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpg8x\" (UniqueName: \"kubernetes.io/projected/346c1269-0f97-43d6-8f52-31f96844a6cf-kube-api-access-gpg8x\") pod \"controller-manager-99b65ccfc-rhcqf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.645857 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.995363 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.995542 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc" event={"ID":"5a1f4f30-31e2-403c-a5a3-c60ef90f79cf","Type":"ContainerDied","Data":"7e30d83b8705c6cb9629cb0e46c8ca316e5a62921efcfecd91cdb6b6d59a65aa"} Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.996022 4717 scope.go:117] "RemoveContainer" containerID="6c058a4a5b87fb10fb9612ca53c9b072f8dc3555475001fdefdb835619e0db46" Mar 08 05:30:23 crc kubenswrapper[4717]: I0308 05:30:23.999218 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" Mar 08 05:30:24 crc kubenswrapper[4717]: I0308 05:30:24.000052 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb" event={"ID":"050a2941-72c2-4173-9305-55c6d38e6d78","Type":"ContainerDied","Data":"7739b6a7f5315d7b1e8fc046491296a5e52c10b88a3f9f1ba70bb63ebc4ab1b9"} Mar 08 05:30:24 crc kubenswrapper[4717]: E0308 05:30:24.000383 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29549130-sbndg" podUID="662f5bb0-c453-44f7-944a-5e39e1a580e9" Mar 08 05:30:24 crc kubenswrapper[4717]: E0308 05:30:24.000396 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29549128-gnphx" podUID="72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2" Mar 08 05:30:24 crc kubenswrapper[4717]: I0308 05:30:24.045002 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc"] Mar 08 05:30:24 crc kubenswrapper[4717]: I0308 05:30:24.048435 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8647d5c774-567mc"] Mar 08 05:30:24 crc kubenswrapper[4717]: I0308 05:30:24.062736 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb"] Mar 08 05:30:24 crc kubenswrapper[4717]: I0308 05:30:24.066871 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f56df4dc7-t7mqb"] Mar 08 05:30:25 crc kubenswrapper[4717]: 
I0308 05:30:25.301582 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb"] Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.304097 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.311530 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.311555 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.311749 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.311978 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.312396 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.312757 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.317362 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb"] Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.410422 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vswbq\" (UniqueName: 
\"kubernetes.io/projected/11d3992f-f5d5-4f31-b711-663d14df5d42-kube-api-access-vswbq\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.411178 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-client-ca\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.411243 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11d3992f-f5d5-4f31-b711-663d14df5d42-serving-cert\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.411314 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-config\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.512352 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11d3992f-f5d5-4f31-b711-663d14df5d42-serving-cert\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " 
pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.512460 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-config\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.512546 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vswbq\" (UniqueName: \"kubernetes.io/projected/11d3992f-f5d5-4f31-b711-663d14df5d42-kube-api-access-vswbq\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.512621 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-client-ca\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.514205 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-client-ca\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.516663 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-config\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.525548 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11d3992f-f5d5-4f31-b711-663d14df5d42-serving-cert\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.532383 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vswbq\" (UniqueName: \"kubernetes.io/projected/11d3992f-f5d5-4f31-b711-663d14df5d42-kube-api-access-vswbq\") pod \"route-controller-manager-674d97cc78-xflbb\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.643752 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.790516 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="050a2941-72c2-4173-9305-55c6d38e6d78" path="/var/lib/kubelet/pods/050a2941-72c2-4173-9305-55c6d38e6d78/volumes" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.791350 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1f4f30-31e2-403c-a5a3-c60ef90f79cf" path="/var/lib/kubelet/pods/5a1f4f30-31e2-403c-a5a3-c60ef90f79cf/volumes" Mar 08 05:30:25 crc kubenswrapper[4717]: I0308 05:30:25.876388 4717 ???:1] "http: TLS handshake error from 192.168.126.11:47048: no serving certificate available for the kubelet" Mar 08 05:30:28 crc kubenswrapper[4717]: E0308 05:30:28.590635 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 08 05:30:28 crc kubenswrapper[4717]: E0308 05:30:28.592758 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n9zzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-cdf65_openshift-marketplace(eb6549f4-05b3-4309-b7f4-3b34fe523413): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 05:30:28 crc kubenswrapper[4717]: E0308 05:30:28.594167 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-cdf65" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" Mar 08 05:30:29 crc 
kubenswrapper[4717]: I0308 05:30:29.240648 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxg8l"] Mar 08 05:30:30 crc kubenswrapper[4717]: I0308 05:30:30.027113 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mxcxh" Mar 08 05:30:30 crc kubenswrapper[4717]: E0308 05:30:30.587846 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-cdf65" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" Mar 08 05:30:30 crc kubenswrapper[4717]: E0308 05:30:30.662564 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 08 05:30:30 crc kubenswrapper[4717]: E0308 05:30:30.663113 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qg4gn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hgbcl_openshift-marketplace(d612266d-387c-4561-a50f-02cd3cced887): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 05:30:30 crc kubenswrapper[4717]: E0308 05:30:30.664920 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hgbcl" podUID="d612266d-387c-4561-a50f-02cd3cced887" Mar 08 05:30:32 crc 
kubenswrapper[4717]: E0308 05:30:32.472387 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hgbcl" podUID="d612266d-387c-4561-a50f-02cd3cced887" Mar 08 05:30:32 crc kubenswrapper[4717]: I0308 05:30:32.524626 4717 scope.go:117] "RemoveContainer" containerID="7589483e111792175409aed1c60a96a8d290b0ccfd5e80c8d6068b13288597c6" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.570363 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.570616 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jhqmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lmkvf_openshift-marketplace(5961d211-7900-41ef-9915-d935e9cec42a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.571930 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lmkvf" podUID="5961d211-7900-41ef-9915-d935e9cec42a" Mar 08 05:30:32 crc 
kubenswrapper[4717]: E0308 05:30:32.606351 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.606598 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qprdm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-n7sb6_openshift-marketplace(724eb749-c200-4929-9a63-f3384e410a6f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.607789 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-n7sb6" podUID="724eb749-c200-4929-9a63-f3384e410a6f" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.643785 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.644002 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hl2w7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-x7t8t_openshift-marketplace(2ce686db-32d9-41b7-80fa-124e094dc4e8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.646218 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-x7t8t" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" Mar 08 05:30:32 crc 
kubenswrapper[4717]: E0308 05:30:32.675054 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.675255 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llcwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-n69hm_openshift-marketplace(06f4ab9f-48eb-410c-8915-c47c5cff1650): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.676699 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-n69hm" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.692856 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.693091 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j79qz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-sqqgh_openshift-marketplace(64d82598-c4ba-4e83-8810-c4b9ad5b2f51): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 05:30:32 crc kubenswrapper[4717]: E0308 05:30:32.694286 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-sqqgh" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" Mar 08 05:30:32 crc 
kubenswrapper[4717]: I0308 05:30:32.774476 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb"] Mar 08 05:30:32 crc kubenswrapper[4717]: I0308 05:30:32.822262 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-99b65ccfc-rhcqf"] Mar 08 05:30:32 crc kubenswrapper[4717]: I0308 05:30:32.881134 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 05:30:32 crc kubenswrapper[4717]: I0308 05:30:32.882598 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 05:30:32 crc kubenswrapper[4717]: I0308 05:30:32.885201 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 05:30:32 crc kubenswrapper[4717]: I0308 05:30:32.885428 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 05:30:32 crc kubenswrapper[4717]: I0308 05:30:32.893563 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.035641 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"07ce85cc-becd-42a2-ae0c-1a5723c911eb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.035763 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"07ce85cc-becd-42a2-ae0c-1a5723c911eb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.070098 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" event={"ID":"346c1269-0f97-43d6-8f52-31f96844a6cf","Type":"ContainerStarted","Data":"515567b6d534e7f20ed67f45651c3598099df04a6517ab1e1547e3480006824e"} Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.070674 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.071751 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" event={"ID":"346c1269-0f97-43d6-8f52-31f96844a6cf","Type":"ContainerStarted","Data":"ae69f0bf9090673e2f3812b7d6170d86a83002e7c46a4e9abc03f76c3942f958"} Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.073075 4717 patch_prober.go:28] interesting pod/controller-manager-99b65ccfc-rhcqf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.073123 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" podUID="346c1269-0f97-43d6-8f52-31f96844a6cf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.076160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w672l" 
event={"ID":"ad20120b-f363-485e-a130-c9e49e4605c4","Type":"ContainerStarted","Data":"86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a"} Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.080416 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" event={"ID":"11d3992f-f5d5-4f31-b711-663d14df5d42","Type":"ContainerStarted","Data":"1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368"} Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.080446 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" event={"ID":"11d3992f-f5d5-4f31-b711-663d14df5d42","Type":"ContainerStarted","Data":"8882bee6ce432cf7a40a263b926dffb5143d1fc3bdab66a21013f4107f68d0b8"} Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.080462 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:33 crc kubenswrapper[4717]: E0308 05:30:33.083357 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-sqqgh" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" Mar 08 05:30:33 crc kubenswrapper[4717]: E0308 05:30:33.083357 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-n69hm" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" Mar 08 05:30:33 crc kubenswrapper[4717]: E0308 05:30:33.083477 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-x7t8t" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" Mar 08 05:30:33 crc kubenswrapper[4717]: E0308 05:30:33.083572 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lmkvf" podUID="5961d211-7900-41ef-9915-d935e9cec42a" Mar 08 05:30:33 crc kubenswrapper[4717]: E0308 05:30:33.083639 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-n7sb6" podUID="724eb749-c200-4929-9a63-f3384e410a6f" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.092103 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" podStartSLOduration=16.092070167 podStartE2EDuration="16.092070167s" podCreationTimestamp="2026-03-08 05:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:30:33.088031822 +0000 UTC m=+260.005680676" watchObservedRunningTime="2026-03-08 05:30:33.092070167 +0000 UTC m=+260.009719011" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.137894 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"07ce85cc-becd-42a2-ae0c-1a5723c911eb\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.138458 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"07ce85cc-becd-42a2-ae0c-1a5723c911eb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.138131 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"07ce85cc-becd-42a2-ae0c-1a5723c911eb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.178367 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"07ce85cc-becd-42a2-ae0c-1a5723c911eb\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.236735 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" podStartSLOduration=16.236712236 podStartE2EDuration="16.236712236s" podCreationTimestamp="2026-03-08 05:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:30:33.235572927 +0000 UTC m=+260.153221771" watchObservedRunningTime="2026-03-08 05:30:33.236712236 +0000 UTC m=+260.154361080" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.296219 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.350214 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.643797 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 05:30:33 crc kubenswrapper[4717]: I0308 05:30:33.651347 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:34 crc kubenswrapper[4717]: I0308 05:30:34.085858 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"07ce85cc-becd-42a2-ae0c-1a5723c911eb","Type":"ContainerStarted","Data":"974f43ff22dd9332eb76379a39234e0f02c6b9fd1edaa66cda36b00327abafac"} Mar 08 05:30:34 crc kubenswrapper[4717]: I0308 05:30:34.086402 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"07ce85cc-becd-42a2-ae0c-1a5723c911eb","Type":"ContainerStarted","Data":"c866402e8b8193a80678a1b36e25c87539743908ec4498d3e33c9a212ed1b9f1"} Mar 08 05:30:34 crc kubenswrapper[4717]: I0308 05:30:34.088025 4717 generic.go:334] "Generic (PLEG): container finished" podID="ad20120b-f363-485e-a130-c9e49e4605c4" containerID="86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a" exitCode=0 Mar 08 05:30:34 crc kubenswrapper[4717]: I0308 05:30:34.088106 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w672l" event={"ID":"ad20120b-f363-485e-a130-c9e49e4605c4","Type":"ContainerDied","Data":"86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a"} Mar 08 05:30:34 crc kubenswrapper[4717]: I0308 05:30:34.104008 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.103981155 podStartE2EDuration="2.103981155s" podCreationTimestamp="2026-03-08 05:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:30:34.101734867 +0000 UTC m=+261.019383711" watchObservedRunningTime="2026-03-08 05:30:34.103981155 +0000 UTC m=+261.021629999" Mar 08 05:30:34 crc kubenswrapper[4717]: I0308 05:30:34.120758 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:30:34 crc kubenswrapper[4717]: I0308 05:30:34.120863 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:30:35 crc kubenswrapper[4717]: I0308 05:30:35.098291 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w672l" event={"ID":"ad20120b-f363-485e-a130-c9e49e4605c4","Type":"ContainerStarted","Data":"64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5"} Mar 08 05:30:35 crc kubenswrapper[4717]: I0308 05:30:35.100703 4717 generic.go:334] "Generic (PLEG): container finished" podID="07ce85cc-becd-42a2-ae0c-1a5723c911eb" containerID="974f43ff22dd9332eb76379a39234e0f02c6b9fd1edaa66cda36b00327abafac" exitCode=0 Mar 08 05:30:35 crc kubenswrapper[4717]: I0308 05:30:35.101195 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"07ce85cc-becd-42a2-ae0c-1a5723c911eb","Type":"ContainerDied","Data":"974f43ff22dd9332eb76379a39234e0f02c6b9fd1edaa66cda36b00327abafac"} Mar 08 05:30:35 crc kubenswrapper[4717]: I0308 05:30:35.120749 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w672l" podStartSLOduration=3.107784342 podStartE2EDuration="36.120715231s" podCreationTimestamp="2026-03-08 05:29:59 +0000 UTC" firstStartedPulling="2026-03-08 05:30:01.542776196 +0000 UTC m=+228.460425040" lastFinishedPulling="2026-03-08 05:30:34.555707085 +0000 UTC m=+261.473355929" observedRunningTime="2026-03-08 05:30:35.119277493 +0000 UTC m=+262.036926337" watchObservedRunningTime="2026-03-08 05:30:35.120715231 +0000 UTC m=+262.038364105" Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.110368 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549130-sbndg" event={"ID":"662f5bb0-c453-44f7-944a-5e39e1a580e9","Type":"ContainerStarted","Data":"8bb5058eeb2c4feb1b3b817f8665954f29c86c298eeadd15978e822edc7e039b"} Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.130796 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549130-sbndg" podStartSLOduration=2.0197967 podStartE2EDuration="36.130768261s" podCreationTimestamp="2026-03-08 05:30:00 +0000 UTC" firstStartedPulling="2026-03-08 05:30:01.635783413 +0000 UTC m=+228.553432257" lastFinishedPulling="2026-03-08 05:30:35.746754974 +0000 UTC m=+262.664403818" observedRunningTime="2026-03-08 05:30:36.130692109 +0000 UTC m=+263.048340953" watchObservedRunningTime="2026-03-08 05:30:36.130768261 +0000 UTC m=+263.048417105" Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.486571 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.556202 4717 csr.go:261] certificate signing request csr-xswhd is approved, waiting to be issued Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.561356 4717 csr.go:257] certificate signing request csr-xswhd is issued Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.618252 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kubelet-dir\") pod \"07ce85cc-becd-42a2-ae0c-1a5723c911eb\" (UID: \"07ce85cc-becd-42a2-ae0c-1a5723c911eb\") " Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.618364 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kube-api-access\") pod \"07ce85cc-becd-42a2-ae0c-1a5723c911eb\" (UID: \"07ce85cc-becd-42a2-ae0c-1a5723c911eb\") " Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.618505 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "07ce85cc-becd-42a2-ae0c-1a5723c911eb" (UID: "07ce85cc-becd-42a2-ae0c-1a5723c911eb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.619345 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.625985 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "07ce85cc-becd-42a2-ae0c-1a5723c911eb" (UID: "07ce85cc-becd-42a2-ae0c-1a5723c911eb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.720934 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07ce85cc-becd-42a2-ae0c-1a5723c911eb-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.980551 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-99b65ccfc-rhcqf"] Mar 08 05:30:36 crc kubenswrapper[4717]: I0308 05:30:36.980849 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" podUID="346c1269-0f97-43d6-8f52-31f96844a6cf" containerName="controller-manager" containerID="cri-o://515567b6d534e7f20ed67f45651c3598099df04a6517ab1e1547e3480006824e" gracePeriod=30 Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.077895 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb"] Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.078321 4717 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" podUID="11d3992f-f5d5-4f31-b711-663d14df5d42" containerName="route-controller-manager" containerID="cri-o://1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368" gracePeriod=30 Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.122132 4717 generic.go:334] "Generic (PLEG): container finished" podID="346c1269-0f97-43d6-8f52-31f96844a6cf" containerID="515567b6d534e7f20ed67f45651c3598099df04a6517ab1e1547e3480006824e" exitCode=0 Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.122599 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" event={"ID":"346c1269-0f97-43d6-8f52-31f96844a6cf","Type":"ContainerDied","Data":"515567b6d534e7f20ed67f45651c3598099df04a6517ab1e1547e3480006824e"} Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.125935 4717 generic.go:334] "Generic (PLEG): container finished" podID="662f5bb0-c453-44f7-944a-5e39e1a580e9" containerID="8bb5058eeb2c4feb1b3b817f8665954f29c86c298eeadd15978e822edc7e039b" exitCode=0 Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.126150 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549130-sbndg" event={"ID":"662f5bb0-c453-44f7-944a-5e39e1a580e9","Type":"ContainerDied","Data":"8bb5058eeb2c4feb1b3b817f8665954f29c86c298eeadd15978e822edc7e039b"} Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.129158 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549128-gnphx" event={"ID":"72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2","Type":"ContainerStarted","Data":"6843ae263f26df42b918308af5e9e8896f253a409750ad92d42291f35ed5f955"} Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.143340 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"07ce85cc-becd-42a2-ae0c-1a5723c911eb","Type":"ContainerDied","Data":"c866402e8b8193a80678a1b36e25c87539743908ec4498d3e33c9a212ed1b9f1"} Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.143367 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.143391 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c866402e8b8193a80678a1b36e25c87539743908ec4498d3e33c9a212ed1b9f1" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.158454 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549128-gnphx" podStartSLOduration=113.595107135 podStartE2EDuration="2m37.158424526s" podCreationTimestamp="2026-03-08 05:28:00 +0000 UTC" firstStartedPulling="2026-03-08 05:29:53.10644691 +0000 UTC m=+220.024095754" lastFinishedPulling="2026-03-08 05:30:36.669764301 +0000 UTC m=+263.587413145" observedRunningTime="2026-03-08 05:30:37.157139022 +0000 UTC m=+264.074787866" watchObservedRunningTime="2026-03-08 05:30:37.158424526 +0000 UTC m=+264.076073360" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.537811 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.564086 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-06 06:07:56.984209401 +0000 UTC Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.564329 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6552h37m19.419884001s for next certificate rotation Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.633635 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-client-ca\") pod \"346c1269-0f97-43d6-8f52-31f96844a6cf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.633699 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-config\") pod \"346c1269-0f97-43d6-8f52-31f96844a6cf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.633744 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpg8x\" (UniqueName: \"kubernetes.io/projected/346c1269-0f97-43d6-8f52-31f96844a6cf-kube-api-access-gpg8x\") pod \"346c1269-0f97-43d6-8f52-31f96844a6cf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.633845 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-proxy-ca-bundles\") pod \"346c1269-0f97-43d6-8f52-31f96844a6cf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.633874 
4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/346c1269-0f97-43d6-8f52-31f96844a6cf-serving-cert\") pod \"346c1269-0f97-43d6-8f52-31f96844a6cf\" (UID: \"346c1269-0f97-43d6-8f52-31f96844a6cf\") " Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.634850 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-client-ca" (OuterVolumeSpecName: "client-ca") pod "346c1269-0f97-43d6-8f52-31f96844a6cf" (UID: "346c1269-0f97-43d6-8f52-31f96844a6cf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.634884 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "346c1269-0f97-43d6-8f52-31f96844a6cf" (UID: "346c1269-0f97-43d6-8f52-31f96844a6cf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.635375 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-config" (OuterVolumeSpecName: "config") pod "346c1269-0f97-43d6-8f52-31f96844a6cf" (UID: "346c1269-0f97-43d6-8f52-31f96844a6cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.640526 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346c1269-0f97-43d6-8f52-31f96844a6cf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "346c1269-0f97-43d6-8f52-31f96844a6cf" (UID: "346c1269-0f97-43d6-8f52-31f96844a6cf"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.640724 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346c1269-0f97-43d6-8f52-31f96844a6cf-kube-api-access-gpg8x" (OuterVolumeSpecName: "kube-api-access-gpg8x") pod "346c1269-0f97-43d6-8f52-31f96844a6cf" (UID: "346c1269-0f97-43d6-8f52-31f96844a6cf"). InnerVolumeSpecName "kube-api-access-gpg8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.645899 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.737358 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.737412 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.737426 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpg8x\" (UniqueName: \"kubernetes.io/projected/346c1269-0f97-43d6-8f52-31f96844a6cf-kube-api-access-gpg8x\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.737441 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/346c1269-0f97-43d6-8f52-31f96844a6cf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.737453 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/346c1269-0f97-43d6-8f52-31f96844a6cf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.841201 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vswbq\" (UniqueName: \"kubernetes.io/projected/11d3992f-f5d5-4f31-b711-663d14df5d42-kube-api-access-vswbq\") pod \"11d3992f-f5d5-4f31-b711-663d14df5d42\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.841341 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11d3992f-f5d5-4f31-b711-663d14df5d42-serving-cert\") pod \"11d3992f-f5d5-4f31-b711-663d14df5d42\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.841485 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-client-ca\") pod \"11d3992f-f5d5-4f31-b711-663d14df5d42\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.841603 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-config\") pod \"11d3992f-f5d5-4f31-b711-663d14df5d42\" (UID: \"11d3992f-f5d5-4f31-b711-663d14df5d42\") " Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.843033 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-client-ca" (OuterVolumeSpecName: "client-ca") pod "11d3992f-f5d5-4f31-b711-663d14df5d42" (UID: "11d3992f-f5d5-4f31-b711-663d14df5d42"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.843291 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-config" (OuterVolumeSpecName: "config") pod "11d3992f-f5d5-4f31-b711-663d14df5d42" (UID: "11d3992f-f5d5-4f31-b711-663d14df5d42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.846361 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d3992f-f5d5-4f31-b711-663d14df5d42-kube-api-access-vswbq" (OuterVolumeSpecName: "kube-api-access-vswbq") pod "11d3992f-f5d5-4f31-b711-663d14df5d42" (UID: "11d3992f-f5d5-4f31-b711-663d14df5d42"). InnerVolumeSpecName "kube-api-access-vswbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.851658 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d3992f-f5d5-4f31-b711-663d14df5d42-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "11d3992f-f5d5-4f31-b711-663d14df5d42" (UID: "11d3992f-f5d5-4f31-b711-663d14df5d42"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.944776 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.945290 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vswbq\" (UniqueName: \"kubernetes.io/projected/11d3992f-f5d5-4f31-b711-663d14df5d42-kube-api-access-vswbq\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.945449 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11d3992f-f5d5-4f31-b711-663d14df5d42-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:37 crc kubenswrapper[4717]: I0308 05:30:37.945580 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11d3992f-f5d5-4f31-b711-663d14df5d42-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.153759 4717 generic.go:334] "Generic (PLEG): container finished" podID="11d3992f-f5d5-4f31-b711-663d14df5d42" containerID="1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368" exitCode=0 Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.155091 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" event={"ID":"11d3992f-f5d5-4f31-b711-663d14df5d42","Type":"ContainerDied","Data":"1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368"} Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.155362 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" 
event={"ID":"11d3992f-f5d5-4f31-b711-663d14df5d42","Type":"ContainerDied","Data":"8882bee6ce432cf7a40a263b926dffb5143d1fc3bdab66a21013f4107f68d0b8"} Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.155388 4717 scope.go:117] "RemoveContainer" containerID="1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.155754 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.166083 4717 generic.go:334] "Generic (PLEG): container finished" podID="72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2" containerID="6843ae263f26df42b918308af5e9e8896f253a409750ad92d42291f35ed5f955" exitCode=0 Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.166190 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549128-gnphx" event={"ID":"72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2","Type":"ContainerDied","Data":"6843ae263f26df42b918308af5e9e8896f253a409750ad92d42291f35ed5f955"} Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.171055 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" event={"ID":"346c1269-0f97-43d6-8f52-31f96844a6cf","Type":"ContainerDied","Data":"ae69f0bf9090673e2f3812b7d6170d86a83002e7c46a4e9abc03f76c3942f958"} Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.171254 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-99b65ccfc-rhcqf" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.209657 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-99b65ccfc-rhcqf"] Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.218179 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-99b65ccfc-rhcqf"] Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.223491 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb"] Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.225976 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-674d97cc78-xflbb"] Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.304773 4717 scope.go:117] "RemoveContainer" containerID="1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368" Mar 08 05:30:38 crc kubenswrapper[4717]: E0308 05:30:38.307109 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368\": container with ID starting with 1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368 not found: ID does not exist" containerID="1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.307165 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368"} err="failed to get container status \"1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368\": rpc error: code = NotFound desc = could not find container \"1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368\": container 
with ID starting with 1b7d7008c50775f27738c7aed9ad63e281dcc51c4e21db8e95e10968ada57368 not found: ID does not exist" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.307208 4717 scope.go:117] "RemoveContainer" containerID="515567b6d534e7f20ed67f45651c3598099df04a6517ab1e1547e3480006824e" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.330220 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75568c6556-hrqpp"] Mar 08 05:30:38 crc kubenswrapper[4717]: E0308 05:30:38.330572 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346c1269-0f97-43d6-8f52-31f96844a6cf" containerName="controller-manager" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.330594 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="346c1269-0f97-43d6-8f52-31f96844a6cf" containerName="controller-manager" Mar 08 05:30:38 crc kubenswrapper[4717]: E0308 05:30:38.330612 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d3992f-f5d5-4f31-b711-663d14df5d42" containerName="route-controller-manager" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.330621 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d3992f-f5d5-4f31-b711-663d14df5d42" containerName="route-controller-manager" Mar 08 05:30:38 crc kubenswrapper[4717]: E0308 05:30:38.330653 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ce85cc-becd-42a2-ae0c-1a5723c911eb" containerName="pruner" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.330660 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ce85cc-becd-42a2-ae0c-1a5723c911eb" containerName="pruner" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.330822 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ce85cc-becd-42a2-ae0c-1a5723c911eb" containerName="pruner" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.330839 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="346c1269-0f97-43d6-8f52-31f96844a6cf" containerName="controller-manager" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.330849 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d3992f-f5d5-4f31-b711-663d14df5d42" containerName="route-controller-manager" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.331349 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.335549 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.343430 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.345499 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.347574 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.347846 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.348936 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.350590 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-config\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " 
pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.350643 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v4hl\" (UniqueName: \"kubernetes.io/projected/1087ff58-5435-4a42-a022-7950d5c5a5a6-kube-api-access-7v4hl\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.350669 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1087ff58-5435-4a42-a022-7950d5c5a5a6-serving-cert\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.350715 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-proxy-ca-bundles\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.350733 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-client-ca\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.350808 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6"] Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.351621 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6"] Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.351637 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75568c6556-hrqpp"] Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.351756 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.354876 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.355251 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.355418 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.355605 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.355805 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.356020 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.356460 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.451473 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-proxy-ca-bundles\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.451531 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-client-ca\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.451596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-config\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.451623 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v4hl\" (UniqueName: \"kubernetes.io/projected/1087ff58-5435-4a42-a022-7950d5c5a5a6-kube-api-access-7v4hl\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.451949 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1087ff58-5435-4a42-a022-7950d5c5a5a6-serving-cert\") pod 
\"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.452710 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-client-ca\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.452826 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-proxy-ca-bundles\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.453375 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549130-sbndg" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.453647 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-config\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.462788 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1087ff58-5435-4a42-a022-7950d5c5a5a6-serving-cert\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.476202 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v4hl\" (UniqueName: \"kubernetes.io/projected/1087ff58-5435-4a42-a022-7950d5c5a5a6-kube-api-access-7v4hl\") pod \"controller-manager-75568c6556-hrqpp\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.553554 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edbe67e8-c426-4c3e-8f67-4c701df8a860-serving-cert\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.553646 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z6kl\" (UniqueName: 
\"kubernetes.io/projected/edbe67e8-c426-4c3e-8f67-4c701df8a860-kube-api-access-7z6kl\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.554161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-client-ca\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.554354 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-config\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.564901 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 05:11:12.20348838 +0000 UTC Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.564943 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6479h40m33.638548988s for next certificate rotation Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.656037 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfjg4\" (UniqueName: \"kubernetes.io/projected/662f5bb0-c453-44f7-944a-5e39e1a580e9-kube-api-access-kfjg4\") pod \"662f5bb0-c453-44f7-944a-5e39e1a580e9\" (UID: \"662f5bb0-c453-44f7-944a-5e39e1a580e9\") " Mar 08 05:30:38 crc 
kubenswrapper[4717]: I0308 05:30:38.656412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-client-ca\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.656465 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-config\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.656489 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edbe67e8-c426-4c3e-8f67-4c701df8a860-serving-cert\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.656512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z6kl\" (UniqueName: \"kubernetes.io/projected/edbe67e8-c426-4c3e-8f67-4c701df8a860-kube-api-access-7z6kl\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.657513 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-client-ca\") pod 
\"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.657897 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-config\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.660292 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/662f5bb0-c453-44f7-944a-5e39e1a580e9-kube-api-access-kfjg4" (OuterVolumeSpecName: "kube-api-access-kfjg4") pod "662f5bb0-c453-44f7-944a-5e39e1a580e9" (UID: "662f5bb0-c453-44f7-944a-5e39e1a580e9"). InnerVolumeSpecName "kube-api-access-kfjg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.662554 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edbe67e8-c426-4c3e-8f67-4c701df8a860-serving-cert\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.672384 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.678517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z6kl\" (UniqueName: \"kubernetes.io/projected/edbe67e8-c426-4c3e-8f67-4c701df8a860-kube-api-access-7z6kl\") pod \"route-controller-manager-5668d59bdc-qdbb6\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.681745 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.757053 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfjg4\" (UniqueName: \"kubernetes.io/projected/662f5bb0-c453-44f7-944a-5e39e1a580e9-kube-api-access-kfjg4\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:38 crc kubenswrapper[4717]: I0308 05:30:38.945508 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6"] Mar 08 05:30:38 crc kubenswrapper[4717]: W0308 05:30:38.955622 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedbe67e8_c426_4c3e_8f67_4c701df8a860.slice/crio-b72c13aac9ad2ad622b63ecfdbdf916e319d4c1c6c3832a3dc10e6bcc6c334ad WatchSource:0}: Error finding container b72c13aac9ad2ad622b63ecfdbdf916e319d4c1c6c3832a3dc10e6bcc6c334ad: Status 404 returned error can't find the container with id b72c13aac9ad2ad622b63ecfdbdf916e319d4c1c6c3832a3dc10e6bcc6c334ad Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.102225 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75568c6556-hrqpp"] Mar 08 05:30:39 crc 
kubenswrapper[4717]: W0308 05:30:39.113524 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1087ff58_5435_4a42_a022_7950d5c5a5a6.slice/crio-5d4a231371fe7d06309e819494a3ceb56adf31e118653a19734e6595e99a29af WatchSource:0}: Error finding container 5d4a231371fe7d06309e819494a3ceb56adf31e118653a19734e6595e99a29af: Status 404 returned error can't find the container with id 5d4a231371fe7d06309e819494a3ceb56adf31e118653a19734e6595e99a29af Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.184466 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549130-sbndg" event={"ID":"662f5bb0-c453-44f7-944a-5e39e1a580e9","Type":"ContainerDied","Data":"bdd093d50642bad05005a1be85e47a793004a157faf24dbe7746764e88a900fe"} Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.184508 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549130-sbndg" Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.184523 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdd093d50642bad05005a1be85e47a793004a157faf24dbe7746764e88a900fe" Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.186088 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" event={"ID":"1087ff58-5435-4a42-a022-7950d5c5a5a6","Type":"ContainerStarted","Data":"5d4a231371fe7d06309e819494a3ceb56adf31e118653a19734e6595e99a29af"} Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.187285 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" event={"ID":"edbe67e8-c426-4c3e-8f67-4c701df8a860","Type":"ContainerStarted","Data":"b72c13aac9ad2ad622b63ecfdbdf916e319d4c1c6c3832a3dc10e6bcc6c334ad"} Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 
05:30:39.421893 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549128-gnphx" Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.472533 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrfjr\" (UniqueName: \"kubernetes.io/projected/72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2-kube-api-access-nrfjr\") pod \"72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2\" (UID: \"72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2\") " Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.483335 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2-kube-api-access-nrfjr" (OuterVolumeSpecName: "kube-api-access-nrfjr") pod "72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2" (UID: "72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2"). InnerVolumeSpecName "kube-api-access-nrfjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.574809 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrfjr\" (UniqueName: \"kubernetes.io/projected/72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2-kube-api-access-nrfjr\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.815467 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d3992f-f5d5-4f31-b711-663d14df5d42" path="/var/lib/kubelet/pods/11d3992f-f5d5-4f31-b711-663d14df5d42/volumes" Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.816328 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346c1269-0f97-43d6-8f52-31f96844a6cf" path="/var/lib/kubelet/pods/346c1269-0f97-43d6-8f52-31f96844a6cf/volumes" Mar 08 05:30:39 crc kubenswrapper[4717]: I0308 05:30:39.850441 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:30:39 crc kubenswrapper[4717]: 
I0308 05:30:39.850512 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.198104 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549128-gnphx" event={"ID":"72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2","Type":"ContainerDied","Data":"6352e5925ab5f2cdf78caa0c4beb3e78229b476ba5695da0e33310965d3e04f9"} Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.198138 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549128-gnphx" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.198161 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6352e5925ab5f2cdf78caa0c4beb3e78229b476ba5695da0e33310965d3e04f9" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.199512 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" event={"ID":"1087ff58-5435-4a42-a022-7950d5c5a5a6","Type":"ContainerStarted","Data":"44fc040f04921b65bed6a3d9c98aa06cfc6069766ad056042b53f14a97da7a91"} Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.199851 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.201735 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" event={"ID":"edbe67e8-c426-4c3e-8f67-4c701df8a860","Type":"ContainerStarted","Data":"ac1b35352246e1dcab0f473c4fbf4870b25ca83a3f5f438464543b03af6fafe2"} Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.202561 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 
08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.205544 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.210817 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.245885 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" podStartSLOduration=3.245859677 podStartE2EDuration="3.245859677s" podCreationTimestamp="2026-03-08 05:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:30:40.230012329 +0000 UTC m=+267.147661183" watchObservedRunningTime="2026-03-08 05:30:40.245859677 +0000 UTC m=+267.163508521" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.273820 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" podStartSLOduration=3.273786262 podStartE2EDuration="3.273786262s" podCreationTimestamp="2026-03-08 05:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:30:40.272654152 +0000 UTC m=+267.190302996" watchObservedRunningTime="2026-03-08 05:30:40.273786262 +0000 UTC m=+267.191435106" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.676568 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 05:30:40 crc kubenswrapper[4717]: E0308 05:30:40.676891 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2" containerName="oc" Mar 08 05:30:40 crc 
kubenswrapper[4717]: I0308 05:30:40.676909 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2" containerName="oc" Mar 08 05:30:40 crc kubenswrapper[4717]: E0308 05:30:40.676931 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="662f5bb0-c453-44f7-944a-5e39e1a580e9" containerName="oc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.676939 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="662f5bb0-c453-44f7-944a-5e39e1a580e9" containerName="oc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.677073 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2" containerName="oc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.677087 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="662f5bb0-c453-44f7-944a-5e39e1a580e9" containerName="oc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.677588 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.680999 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.681286 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.692643 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.692738 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kube-api-access\") pod \"installer-9-crc\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.692812 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-var-lock\") pod \"installer-9-crc\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.704906 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.794148 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.794373 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kube-api-access\") pod \"installer-9-crc\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.794435 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-var-lock\") pod \"installer-9-crc\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.794483 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-var-lock\") pod \"installer-9-crc\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.794434 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.814108 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kube-api-access\") pod \"installer-9-crc\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:40 crc kubenswrapper[4717]: I0308 05:30:40.999075 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:30:41 crc kubenswrapper[4717]: I0308 05:30:41.389190 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w672l" podUID="ad20120b-f363-485e-a130-c9e49e4605c4" containerName="registry-server" probeResult="failure" output=< Mar 08 05:30:41 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 05:30:41 crc kubenswrapper[4717]: > Mar 08 05:30:41 crc kubenswrapper[4717]: I0308 05:30:41.428166 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 05:30:41 crc kubenswrapper[4717]: W0308 05:30:41.438356 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb8b0fb22_162d_4b6b_a7f5_3fbeb4c71c84.slice/crio-3384aee515161a59ecf6a7cd298815beafe642cba4e129c42f66d9adaa96ade5 WatchSource:0}: Error finding container 3384aee515161a59ecf6a7cd298815beafe642cba4e129c42f66d9adaa96ade5: Status 404 returned error can't find the container with id 3384aee515161a59ecf6a7cd298815beafe642cba4e129c42f66d9adaa96ade5 Mar 08 05:30:42 crc kubenswrapper[4717]: I0308 05:30:42.216547 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84","Type":"ContainerStarted","Data":"59cd832ba158a57bd24080c46c76326c92c95d9dbf2ea262f8b7e79ff0b6bfc3"} Mar 08 05:30:42 crc kubenswrapper[4717]: I0308 05:30:42.217232 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84","Type":"ContainerStarted","Data":"3384aee515161a59ecf6a7cd298815beafe642cba4e129c42f66d9adaa96ade5"} Mar 08 05:30:42 crc kubenswrapper[4717]: I0308 
05:30:42.238800 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.238775712 podStartE2EDuration="2.238775712s" podCreationTimestamp="2026-03-08 05:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:30:42.238444393 +0000 UTC m=+269.156093267" watchObservedRunningTime="2026-03-08 05:30:42.238775712 +0000 UTC m=+269.156424546" Mar 08 05:30:44 crc kubenswrapper[4717]: I0308 05:30:44.272638 4717 generic.go:334] "Generic (PLEG): container finished" podID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerID="585a984042a44528285f33a5892346340b8be4dbe691652be068e68a72eebd86" exitCode=0 Mar 08 05:30:44 crc kubenswrapper[4717]: I0308 05:30:44.272803 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdf65" event={"ID":"eb6549f4-05b3-4309-b7f4-3b34fe523413","Type":"ContainerDied","Data":"585a984042a44528285f33a5892346340b8be4dbe691652be068e68a72eebd86"} Mar 08 05:30:46 crc kubenswrapper[4717]: I0308 05:30:46.296421 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmkvf" event={"ID":"5961d211-7900-41ef-9915-d935e9cec42a","Type":"ContainerStarted","Data":"adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130"} Mar 08 05:30:46 crc kubenswrapper[4717]: I0308 05:30:46.302663 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdf65" event={"ID":"eb6549f4-05b3-4309-b7f4-3b34fe523413","Type":"ContainerStarted","Data":"3ace2bbc842356ba5f91bd1e3c708f66b403d43c6e5c692dc067cae8033fdb39"} Mar 08 05:30:46 crc kubenswrapper[4717]: I0308 05:30:46.307277 4717 generic.go:334] "Generic (PLEG): container finished" podID="d612266d-387c-4561-a50f-02cd3cced887" containerID="df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71" 
exitCode=0 Mar 08 05:30:46 crc kubenswrapper[4717]: I0308 05:30:46.307338 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgbcl" event={"ID":"d612266d-387c-4561-a50f-02cd3cced887","Type":"ContainerDied","Data":"df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71"} Mar 08 05:30:46 crc kubenswrapper[4717]: I0308 05:30:46.373134 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cdf65" podStartSLOduration=3.100993513 podStartE2EDuration="48.373103125s" podCreationTimestamp="2026-03-08 05:29:58 +0000 UTC" firstStartedPulling="2026-03-08 05:30:00.434560015 +0000 UTC m=+227.352208869" lastFinishedPulling="2026-03-08 05:30:45.706669637 +0000 UTC m=+272.624318481" observedRunningTime="2026-03-08 05:30:46.368483613 +0000 UTC m=+273.286132497" watchObservedRunningTime="2026-03-08 05:30:46.373103125 +0000 UTC m=+273.290751969" Mar 08 05:30:47 crc kubenswrapper[4717]: I0308 05:30:47.319350 4717 generic.go:334] "Generic (PLEG): container finished" podID="5961d211-7900-41ef-9915-d935e9cec42a" containerID="adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130" exitCode=0 Mar 08 05:30:47 crc kubenswrapper[4717]: I0308 05:30:47.319532 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmkvf" event={"ID":"5961d211-7900-41ef-9915-d935e9cec42a","Type":"ContainerDied","Data":"adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130"} Mar 08 05:30:47 crc kubenswrapper[4717]: I0308 05:30:47.319970 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmkvf" event={"ID":"5961d211-7900-41ef-9915-d935e9cec42a","Type":"ContainerStarted","Data":"b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf"} Mar 08 05:30:47 crc kubenswrapper[4717]: I0308 05:30:47.325297 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hgbcl" event={"ID":"d612266d-387c-4561-a50f-02cd3cced887","Type":"ContainerStarted","Data":"d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d"} Mar 08 05:30:47 crc kubenswrapper[4717]: I0308 05:30:47.329412 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n69hm" event={"ID":"06f4ab9f-48eb-410c-8915-c47c5cff1650","Type":"ContainerStarted","Data":"1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55"} Mar 08 05:30:47 crc kubenswrapper[4717]: I0308 05:30:47.340259 4717 generic.go:334] "Generic (PLEG): container finished" podID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerID="75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6" exitCode=0 Mar 08 05:30:47 crc kubenswrapper[4717]: I0308 05:30:47.340330 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7t8t" event={"ID":"2ce686db-32d9-41b7-80fa-124e094dc4e8","Type":"ContainerDied","Data":"75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6"} Mar 08 05:30:47 crc kubenswrapper[4717]: I0308 05:30:47.365146 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lmkvf" podStartSLOduration=2.950382146 podStartE2EDuration="51.365091589s" podCreationTimestamp="2026-03-08 05:29:56 +0000 UTC" firstStartedPulling="2026-03-08 05:29:58.331335126 +0000 UTC m=+225.248983970" lastFinishedPulling="2026-03-08 05:30:46.746044569 +0000 UTC m=+273.663693413" observedRunningTime="2026-03-08 05:30:47.358400102 +0000 UTC m=+274.276048966" watchObservedRunningTime="2026-03-08 05:30:47.365091589 +0000 UTC m=+274.282740443" Mar 08 05:30:47 crc kubenswrapper[4717]: I0308 05:30:47.381191 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hgbcl" podStartSLOduration=3.9651015530000002 podStartE2EDuration="52.381159232s" 
podCreationTimestamp="2026-03-08 05:29:55 +0000 UTC" firstStartedPulling="2026-03-08 05:29:58.302793984 +0000 UTC m=+225.220442828" lastFinishedPulling="2026-03-08 05:30:46.718851663 +0000 UTC m=+273.636500507" observedRunningTime="2026-03-08 05:30:47.381050059 +0000 UTC m=+274.298698913" watchObservedRunningTime="2026-03-08 05:30:47.381159232 +0000 UTC m=+274.298808076" Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.366101 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7t8t" event={"ID":"2ce686db-32d9-41b7-80fa-124e094dc4e8","Type":"ContainerStarted","Data":"cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8"} Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.370157 4717 generic.go:334] "Generic (PLEG): container finished" podID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerID="96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0" exitCode=0 Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.370216 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqqgh" event={"ID":"64d82598-c4ba-4e83-8810-c4b9ad5b2f51","Type":"ContainerDied","Data":"96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0"} Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.373345 4717 generic.go:334] "Generic (PLEG): container finished" podID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerID="1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55" exitCode=0 Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.373389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n69hm" event={"ID":"06f4ab9f-48eb-410c-8915-c47c5cff1650","Type":"ContainerDied","Data":"1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55"} Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.417357 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-x7t8t" podStartSLOduration=1.999466606 podStartE2EDuration="50.417318881s" podCreationTimestamp="2026-03-08 05:29:58 +0000 UTC" firstStartedPulling="2026-03-08 05:29:59.362941336 +0000 UTC m=+226.280590180" lastFinishedPulling="2026-03-08 05:30:47.780793611 +0000 UTC m=+274.698442455" observedRunningTime="2026-03-08 05:30:48.391596073 +0000 UTC m=+275.309244927" watchObservedRunningTime="2026-03-08 05:30:48.417318881 +0000 UTC m=+275.334967765" Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.588043 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.589117 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.901927 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.902010 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:30:48 crc kubenswrapper[4717]: I0308 05:30:48.949254 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.386611 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqqgh" event={"ID":"64d82598-c4ba-4e83-8810-c4b9ad5b2f51","Type":"ContainerStarted","Data":"1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6"} Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.391817 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n69hm" 
event={"ID":"06f4ab9f-48eb-410c-8915-c47c5cff1650","Type":"ContainerStarted","Data":"20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e"} Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.395074 4717 generic.go:334] "Generic (PLEG): container finished" podID="724eb749-c200-4929-9a63-f3384e410a6f" containerID="aa4de12c4a9e874febb7019781a42ed7046d898a4eb05229908a04ca2e2ec089" exitCode=0 Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.395181 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7sb6" event={"ID":"724eb749-c200-4929-9a63-f3384e410a6f","Type":"ContainerDied","Data":"aa4de12c4a9e874febb7019781a42ed7046d898a4eb05229908a04ca2e2ec089"} Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.429141 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sqqgh" podStartSLOduration=4.050025026 podStartE2EDuration="53.429119118s" podCreationTimestamp="2026-03-08 05:29:56 +0000 UTC" firstStartedPulling="2026-03-08 05:29:59.394963038 +0000 UTC m=+226.312611882" lastFinishedPulling="2026-03-08 05:30:48.77405713 +0000 UTC m=+275.691705974" observedRunningTime="2026-03-08 05:30:49.424339132 +0000 UTC m=+276.341987986" watchObservedRunningTime="2026-03-08 05:30:49.429119118 +0000 UTC m=+276.346767972" Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.468012 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.468119 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.468394 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n69hm" podStartSLOduration=3.40576716 podStartE2EDuration="50.468363112s" 
podCreationTimestamp="2026-03-08 05:29:59 +0000 UTC" firstStartedPulling="2026-03-08 05:30:01.687589119 +0000 UTC m=+228.605237963" lastFinishedPulling="2026-03-08 05:30:48.750185071 +0000 UTC m=+275.667833915" observedRunningTime="2026-03-08 05:30:49.464421568 +0000 UTC m=+276.382070422" watchObservedRunningTime="2026-03-08 05:30:49.468363112 +0000 UTC m=+276.386011966" Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.648104 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-x7t8t" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerName="registry-server" probeResult="failure" output=< Mar 08 05:30:49 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 05:30:49 crc kubenswrapper[4717]: > Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.902162 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:30:49 crc kubenswrapper[4717]: I0308 05:30:49.958929 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:30:50 crc kubenswrapper[4717]: I0308 05:30:50.405165 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7sb6" event={"ID":"724eb749-c200-4929-9a63-f3384e410a6f","Type":"ContainerStarted","Data":"a07ae705cad49c7fb94fb794615bd28936bfa2e46656ac76b94da3a1af96a39f"} Mar 08 05:30:50 crc kubenswrapper[4717]: I0308 05:30:50.435205 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n7sb6" podStartSLOduration=2.927478247 podStartE2EDuration="54.435181663s" podCreationTimestamp="2026-03-08 05:29:56 +0000 UTC" firstStartedPulling="2026-03-08 05:29:58.315591287 +0000 UTC m=+225.233240131" lastFinishedPulling="2026-03-08 05:30:49.823294663 +0000 UTC m=+276.740943547" observedRunningTime="2026-03-08 
05:30:50.431882076 +0000 UTC m=+277.349530940" watchObservedRunningTime="2026-03-08 05:30:50.435181663 +0000 UTC m=+277.352830507" Mar 08 05:30:50 crc kubenswrapper[4717]: I0308 05:30:50.519197 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n69hm" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerName="registry-server" probeResult="failure" output=< Mar 08 05:30:50 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 05:30:50 crc kubenswrapper[4717]: > Mar 08 05:30:51 crc kubenswrapper[4717]: I0308 05:30:51.629956 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w672l"] Mar 08 05:30:51 crc kubenswrapper[4717]: I0308 05:30:51.630661 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w672l" podUID="ad20120b-f363-485e-a130-c9e49e4605c4" containerName="registry-server" containerID="cri-o://64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5" gracePeriod=2 Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.250343 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.393394 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-utilities\") pod \"ad20120b-f363-485e-a130-c9e49e4605c4\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.393501 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjbx5\" (UniqueName: \"kubernetes.io/projected/ad20120b-f363-485e-a130-c9e49e4605c4-kube-api-access-cjbx5\") pod \"ad20120b-f363-485e-a130-c9e49e4605c4\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.393566 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-catalog-content\") pod \"ad20120b-f363-485e-a130-c9e49e4605c4\" (UID: \"ad20120b-f363-485e-a130-c9e49e4605c4\") " Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.394857 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-utilities" (OuterVolumeSpecName: "utilities") pod "ad20120b-f363-485e-a130-c9e49e4605c4" (UID: "ad20120b-f363-485e-a130-c9e49e4605c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.401578 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad20120b-f363-485e-a130-c9e49e4605c4-kube-api-access-cjbx5" (OuterVolumeSpecName: "kube-api-access-cjbx5") pod "ad20120b-f363-485e-a130-c9e49e4605c4" (UID: "ad20120b-f363-485e-a130-c9e49e4605c4"). InnerVolumeSpecName "kube-api-access-cjbx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.423254 4717 generic.go:334] "Generic (PLEG): container finished" podID="ad20120b-f363-485e-a130-c9e49e4605c4" containerID="64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5" exitCode=0 Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.423354 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w672l" event={"ID":"ad20120b-f363-485e-a130-c9e49e4605c4","Type":"ContainerDied","Data":"64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5"} Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.423435 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w672l" event={"ID":"ad20120b-f363-485e-a130-c9e49e4605c4","Type":"ContainerDied","Data":"1689181308d709a54903c6962c992831090073f655a45e37ce92827318fb781f"} Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.423459 4717 scope.go:117] "RemoveContainer" containerID="64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.423375 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w672l" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.450873 4717 scope.go:117] "RemoveContainer" containerID="86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.472044 4717 scope.go:117] "RemoveContainer" containerID="356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.495030 4717 scope.go:117] "RemoveContainer" containerID="64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.496269 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.496298 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjbx5\" (UniqueName: \"kubernetes.io/projected/ad20120b-f363-485e-a130-c9e49e4605c4-kube-api-access-cjbx5\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:52 crc kubenswrapper[4717]: E0308 05:30:52.496416 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5\": container with ID starting with 64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5 not found: ID does not exist" containerID="64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.496455 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5"} err="failed to get container status \"64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5\": rpc error: code = NotFound desc = could not 
find container \"64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5\": container with ID starting with 64b461b3413671b1fa05b64c606837d86c0072e3b2f20f8c9229cfb2ba7f2df5 not found: ID does not exist" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.496494 4717 scope.go:117] "RemoveContainer" containerID="86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a" Mar 08 05:30:52 crc kubenswrapper[4717]: E0308 05:30:52.497170 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a\": container with ID starting with 86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a not found: ID does not exist" containerID="86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.497241 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a"} err="failed to get container status \"86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a\": rpc error: code = NotFound desc = could not find container \"86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a\": container with ID starting with 86c81dd047094cdf513f3398ce97effffa3df7d5e12969525b52d9009429e88a not found: ID does not exist" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.497293 4717 scope.go:117] "RemoveContainer" containerID="356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633" Mar 08 05:30:52 crc kubenswrapper[4717]: E0308 05:30:52.497728 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633\": container with ID starting with 356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633 not found: ID 
does not exist" containerID="356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.497765 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633"} err="failed to get container status \"356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633\": rpc error: code = NotFound desc = could not find container \"356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633\": container with ID starting with 356b1c6809a0f9cf68db3f51bd33ca4a19026ba5de24d757c1ecf5b02dcde633 not found: ID does not exist" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.552109 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad20120b-f363-485e-a130-c9e49e4605c4" (UID: "ad20120b-f363-485e-a130-c9e49e4605c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.598123 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad20120b-f363-485e-a130-c9e49e4605c4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.763429 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w672l"] Mar 08 05:30:52 crc kubenswrapper[4717]: I0308 05:30:52.777077 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w672l"] Mar 08 05:30:53 crc kubenswrapper[4717]: I0308 05:30:53.796957 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad20120b-f363-485e-a130-c9e49e4605c4" path="/var/lib/kubelet/pods/ad20120b-f363-485e-a130-c9e49e4605c4/volumes" Mar 08 05:30:54 crc kubenswrapper[4717]: I0308 05:30:54.272725 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" podUID="f2de96d0-d47e-4240-832d-c9b1e1c882df" containerName="oauth-openshift" containerID="cri-o://dbe1e79d699afa14b45f81aba1596899e657267e534b6868ffe9b9baf8a4cb52" gracePeriod=15 Mar 08 05:30:54 crc kubenswrapper[4717]: I0308 05:30:54.446626 4717 generic.go:334] "Generic (PLEG): container finished" podID="f2de96d0-d47e-4240-832d-c9b1e1c882df" containerID="dbe1e79d699afa14b45f81aba1596899e657267e534b6868ffe9b9baf8a4cb52" exitCode=0 Mar 08 05:30:54 crc kubenswrapper[4717]: I0308 05:30:54.446764 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" event={"ID":"f2de96d0-d47e-4240-832d-c9b1e1c882df","Type":"ContainerDied","Data":"dbe1e79d699afa14b45f81aba1596899e657267e534b6868ffe9b9baf8a4cb52"} Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.395148 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.462824 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" event={"ID":"f2de96d0-d47e-4240-832d-c9b1e1c882df","Type":"ContainerDied","Data":"031c898fb5340e6a3747754d7ec77f05c4c3b25d6922ccc1e00466c7413e58ae"} Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.462906 4717 scope.go:117] "RemoveContainer" containerID="dbe1e79d699afa14b45f81aba1596899e657267e534b6868ffe9b9baf8a4cb52" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.462907 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxg8l" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.546828 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzfr5\" (UniqueName: \"kubernetes.io/projected/f2de96d0-d47e-4240-832d-c9b1e1c882df-kube-api-access-jzfr5\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.546886 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-login\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.546986 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-policies\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.547859 4717 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.547921 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-router-certs\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.547985 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-session\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.548692 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-error\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.548733 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-ocp-branding-template\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.548762 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-idp-0-file-data\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.548794 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-dir\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.548815 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-serving-cert\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.548847 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-service-ca\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.548872 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-trusted-ca-bundle\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.548907 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-provider-selection\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.548932 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-cliconfig\") pod \"f2de96d0-d47e-4240-832d-c9b1e1c882df\" (UID: \"f2de96d0-d47e-4240-832d-c9b1e1c882df\") " Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.549183 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.549654 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.549747 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.550579 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.550838 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.554836 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.555288 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.555799 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.556008 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2de96d0-d47e-4240-832d-c9b1e1c882df-kube-api-access-jzfr5" (OuterVolumeSpecName: "kube-api-access-jzfr5") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "kube-api-access-jzfr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.556059 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.557311 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.565088 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.573255 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.573772 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f2de96d0-d47e-4240-832d-c9b1e1c882df" (UID: "f2de96d0-d47e-4240-832d-c9b1e1c882df"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.650878 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651177 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651292 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651380 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651451 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzfr5\" (UniqueName: \"kubernetes.io/projected/f2de96d0-d47e-4240-832d-c9b1e1c882df-kube-api-access-jzfr5\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651515 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651575 4717 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651633 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651712 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651790 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651852 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651909 4717 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2de96d0-d47e-4240-832d-c9b1e1c882df-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.651971 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2de96d0-d47e-4240-832d-c9b1e1c882df-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 
05:30:55.810530 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxg8l"] Mar 08 05:30:55 crc kubenswrapper[4717]: I0308 05:30:55.816171 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxg8l"] Mar 08 05:30:56 crc kubenswrapper[4717]: I0308 05:30:56.359428 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:30:56 crc kubenswrapper[4717]: I0308 05:30:56.359502 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:30:56 crc kubenswrapper[4717]: I0308 05:30:56.429767 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:30:56 crc kubenswrapper[4717]: I0308 05:30:56.517392 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:30:56 crc kubenswrapper[4717]: I0308 05:30:56.775220 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:30:56 crc kubenswrapper[4717]: I0308 05:30:56.775278 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:30:56 crc kubenswrapper[4717]: I0308 05:30:56.832859 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:30:56 crc kubenswrapper[4717]: I0308 05:30:56.996728 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:30:56 crc kubenswrapper[4717]: I0308 05:30:56.996823 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.000112 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75568c6556-hrqpp"] Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.003453 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" podUID="1087ff58-5435-4a42-a022-7950d5c5a5a6" containerName="controller-manager" containerID="cri-o://44fc040f04921b65bed6a3d9c98aa06cfc6069766ad056042b53f14a97da7a91" gracePeriod=30 Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.047634 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6"] Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.047962 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" podUID="edbe67e8-c426-4c3e-8f67-4c701df8a860" containerName="route-controller-manager" containerID="cri-o://ac1b35352246e1dcab0f473c4fbf4870b25ca83a3f5f438464543b03af6fafe2" gracePeriod=30 Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.060203 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.060822 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.094174 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.112103 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.485345 4717 generic.go:334] "Generic (PLEG): container finished" podID="1087ff58-5435-4a42-a022-7950d5c5a5a6" containerID="44fc040f04921b65bed6a3d9c98aa06cfc6069766ad056042b53f14a97da7a91" exitCode=0 Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.487522 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" event={"ID":"1087ff58-5435-4a42-a022-7950d5c5a5a6","Type":"ContainerDied","Data":"44fc040f04921b65bed6a3d9c98aa06cfc6069766ad056042b53f14a97da7a91"} Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.489726 4717 generic.go:334] "Generic (PLEG): container finished" podID="edbe67e8-c426-4c3e-8f67-4c701df8a860" containerID="ac1b35352246e1dcab0f473c4fbf4870b25ca83a3f5f438464543b03af6fafe2" exitCode=0 Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.489958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" event={"ID":"edbe67e8-c426-4c3e-8f67-4c701df8a860","Type":"ContainerDied","Data":"ac1b35352246e1dcab0f473c4fbf4870b25ca83a3f5f438464543b03af6fafe2"} Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.567835 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.568033 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.570549 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:30:57 crc kubenswrapper[4717]: I0308 05:30:57.792307 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f2de96d0-d47e-4240-832d-c9b1e1c882df" path="/var/lib/kubelet/pods/f2de96d0-d47e-4240-832d-c9b1e1c882df/volumes" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.042469 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.094535 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h"] Mar 08 05:30:58 crc kubenswrapper[4717]: E0308 05:30:58.095165 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad20120b-f363-485e-a130-c9e49e4605c4" containerName="extract-utilities" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.095209 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad20120b-f363-485e-a130-c9e49e4605c4" containerName="extract-utilities" Mar 08 05:30:58 crc kubenswrapper[4717]: E0308 05:30:58.095247 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2de96d0-d47e-4240-832d-c9b1e1c882df" containerName="oauth-openshift" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.095261 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2de96d0-d47e-4240-832d-c9b1e1c882df" containerName="oauth-openshift" Mar 08 05:30:58 crc kubenswrapper[4717]: E0308 05:30:58.095276 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edbe67e8-c426-4c3e-8f67-4c701df8a860" containerName="route-controller-manager" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.095292 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="edbe67e8-c426-4c3e-8f67-4c701df8a860" containerName="route-controller-manager" Mar 08 05:30:58 crc kubenswrapper[4717]: E0308 05:30:58.095323 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad20120b-f363-485e-a130-c9e49e4605c4" containerName="extract-content" Mar 08 05:30:58 crc 
kubenswrapper[4717]: I0308 05:30:58.095337 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad20120b-f363-485e-a130-c9e49e4605c4" containerName="extract-content" Mar 08 05:30:58 crc kubenswrapper[4717]: E0308 05:30:58.095359 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad20120b-f363-485e-a130-c9e49e4605c4" containerName="registry-server" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.095372 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad20120b-f363-485e-a130-c9e49e4605c4" containerName="registry-server" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.095549 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad20120b-f363-485e-a130-c9e49e4605c4" containerName="registry-server" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.095582 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="edbe67e8-c426-4c3e-8f67-4c701df8a860" containerName="route-controller-manager" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.095602 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2de96d0-d47e-4240-832d-c9b1e1c882df" containerName="oauth-openshift" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.096266 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.107416 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h"] Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.114845 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-client-ca\") pod \"edbe67e8-c426-4c3e-8f67-4c701df8a860\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.114942 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edbe67e8-c426-4c3e-8f67-4c701df8a860-serving-cert\") pod \"edbe67e8-c426-4c3e-8f67-4c701df8a860\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.115047 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z6kl\" (UniqueName: \"kubernetes.io/projected/edbe67e8-c426-4c3e-8f67-4c701df8a860-kube-api-access-7z6kl\") pod \"edbe67e8-c426-4c3e-8f67-4c701df8a860\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.115090 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-config\") pod \"edbe67e8-c426-4c3e-8f67-4c701df8a860\" (UID: \"edbe67e8-c426-4c3e-8f67-4c701df8a860\") " Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.115217 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzdzf\" (UniqueName: \"kubernetes.io/projected/c465a981-50c1-4ecb-98e2-7c0bf1eac868-kube-api-access-gzdzf\") 
pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.115322 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c465a981-50c1-4ecb-98e2-7c0bf1eac868-serving-cert\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.115350 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-client-ca\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.115384 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-config\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.116759 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-client-ca" (OuterVolumeSpecName: "client-ca") pod "edbe67e8-c426-4c3e-8f67-4c701df8a860" (UID: "edbe67e8-c426-4c3e-8f67-4c701df8a860"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.122611 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-config" (OuterVolumeSpecName: "config") pod "edbe67e8-c426-4c3e-8f67-4c701df8a860" (UID: "edbe67e8-c426-4c3e-8f67-4c701df8a860"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.126315 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edbe67e8-c426-4c3e-8f67-4c701df8a860-kube-api-access-7z6kl" (OuterVolumeSpecName: "kube-api-access-7z6kl") pod "edbe67e8-c426-4c3e-8f67-4c701df8a860" (UID: "edbe67e8-c426-4c3e-8f67-4c701df8a860"). InnerVolumeSpecName "kube-api-access-7z6kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.136940 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edbe67e8-c426-4c3e-8f67-4c701df8a860-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "edbe67e8-c426-4c3e-8f67-4c701df8a860" (UID: "edbe67e8-c426-4c3e-8f67-4c701df8a860"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.194530 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.216641 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-proxy-ca-bundles\") pod \"1087ff58-5435-4a42-a022-7950d5c5a5a6\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.216801 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1087ff58-5435-4a42-a022-7950d5c5a5a6-serving-cert\") pod \"1087ff58-5435-4a42-a022-7950d5c5a5a6\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.216874 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-client-ca\") pod \"1087ff58-5435-4a42-a022-7950d5c5a5a6\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.216904 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-config\") pod \"1087ff58-5435-4a42-a022-7950d5c5a5a6\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.216945 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v4hl\" (UniqueName: \"kubernetes.io/projected/1087ff58-5435-4a42-a022-7950d5c5a5a6-kube-api-access-7v4hl\") pod \"1087ff58-5435-4a42-a022-7950d5c5a5a6\" (UID: \"1087ff58-5435-4a42-a022-7950d5c5a5a6\") " Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.218948 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1087ff58-5435-4a42-a022-7950d5c5a5a6" (UID: "1087ff58-5435-4a42-a022-7950d5c5a5a6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.219930 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "1087ff58-5435-4a42-a022-7950d5c5a5a6" (UID: "1087ff58-5435-4a42-a022-7950d5c5a5a6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.220802 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-config" (OuterVolumeSpecName: "config") pod "1087ff58-5435-4a42-a022-7950d5c5a5a6" (UID: "1087ff58-5435-4a42-a022-7950d5c5a5a6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.221172 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c465a981-50c1-4ecb-98e2-7c0bf1eac868-serving-cert\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.221248 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-client-ca\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.221287 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-config\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.221376 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzdzf\" (UniqueName: \"kubernetes.io/projected/c465a981-50c1-4ecb-98e2-7c0bf1eac868-kube-api-access-gzdzf\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.221482 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/edbe67e8-c426-4c3e-8f67-4c701df8a860-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.221507 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.221525 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.221544 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1087ff58-5435-4a42-a022-7950d5c5a5a6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.221563 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z6kl\" (UniqueName: \"kubernetes.io/projected/edbe67e8-c426-4c3e-8f67-4c701df8a860-kube-api-access-7z6kl\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.221717 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.223137 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-client-ca\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.223177 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-config\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.223499 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edbe67e8-c426-4c3e-8f67-4c701df8a860-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.224526 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1087ff58-5435-4a42-a022-7950d5c5a5a6-kube-api-access-7v4hl" (OuterVolumeSpecName: "kube-api-access-7v4hl") pod "1087ff58-5435-4a42-a022-7950d5c5a5a6" (UID: "1087ff58-5435-4a42-a022-7950d5c5a5a6"). InnerVolumeSpecName "kube-api-access-7v4hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.225649 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c465a981-50c1-4ecb-98e2-7c0bf1eac868-serving-cert\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.234430 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1087ff58-5435-4a42-a022-7950d5c5a5a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1087ff58-5435-4a42-a022-7950d5c5a5a6" (UID: "1087ff58-5435-4a42-a022-7950d5c5a5a6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.251993 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzdzf\" (UniqueName: \"kubernetes.io/projected/c465a981-50c1-4ecb-98e2-7c0bf1eac868-kube-api-access-gzdzf\") pod \"route-controller-manager-86d94d9f8b-bng8h\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.325484 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1087ff58-5435-4a42-a022-7950d5c5a5a6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.325789 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v4hl\" (UniqueName: \"kubernetes.io/projected/1087ff58-5435-4a42-a022-7950d5c5a5a6-kube-api-access-7v4hl\") on node \"crc\" DevicePath \"\"" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.414547 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.508536 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.508526 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75568c6556-hrqpp" event={"ID":"1087ff58-5435-4a42-a022-7950d5c5a5a6","Type":"ContainerDied","Data":"5d4a231371fe7d06309e819494a3ceb56adf31e118653a19734e6595e99a29af"} Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.508823 4717 scope.go:117] "RemoveContainer" containerID="44fc040f04921b65bed6a3d9c98aa06cfc6069766ad056042b53f14a97da7a91" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.517377 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" event={"ID":"edbe67e8-c426-4c3e-8f67-4c701df8a860","Type":"ContainerDied","Data":"b72c13aac9ad2ad622b63ecfdbdf916e319d4c1c6c3832a3dc10e6bcc6c334ad"} Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.517874 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.542292 4717 scope.go:117] "RemoveContainer" containerID="ac1b35352246e1dcab0f473c4fbf4870b25ca83a3f5f438464543b03af6fafe2" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.571340 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75568c6556-hrqpp"] Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.585965 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75568c6556-hrqpp"] Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.589114 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6"] Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.592211 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5668d59bdc-qdbb6"] Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.646814 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.706181 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.826802 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sqqgh"] Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.923796 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h"] Mar 08 05:30:58 crc kubenswrapper[4717]: I0308 05:30:58.976056 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:30:59 crc kubenswrapper[4717]: I0308 05:30:59.528541 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" event={"ID":"c465a981-50c1-4ecb-98e2-7c0bf1eac868","Type":"ContainerStarted","Data":"1b33eabb299c83ad67b739c3768f6bacad1869ef616a6c9eebd318ac7145ff99"} Mar 08 05:30:59 crc kubenswrapper[4717]: I0308 05:30:59.531812 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sqqgh" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerName="registry-server" containerID="cri-o://1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6" gracePeriod=2 Mar 08 05:30:59 crc kubenswrapper[4717]: I0308 05:30:59.545079 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:30:59 crc kubenswrapper[4717]: I0308 05:30:59.610275 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:30:59 crc kubenswrapper[4717]: I0308 05:30:59.794287 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1087ff58-5435-4a42-a022-7950d5c5a5a6" path="/var/lib/kubelet/pods/1087ff58-5435-4a42-a022-7950d5c5a5a6/volumes" Mar 08 05:30:59 crc kubenswrapper[4717]: I0308 05:30:59.795785 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edbe67e8-c426-4c3e-8f67-4c701df8a860" path="/var/lib/kubelet/pods/edbe67e8-c426-4c3e-8f67-4c701df8a860/volumes" Mar 08 05:30:59 crc kubenswrapper[4717]: I0308 05:30:59.828887 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7sb6"] Mar 08 05:30:59 crc kubenswrapper[4717]: I0308 05:30:59.829333 4717 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-n7sb6" podUID="724eb749-c200-4929-9a63-f3384e410a6f" containerName="registry-server" containerID="cri-o://a07ae705cad49c7fb94fb794615bd28936bfa2e46656ac76b94da3a1af96a39f" gracePeriod=2 Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.344835 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c857fb667-vw5dt"] Mar 08 05:31:00 crc kubenswrapper[4717]: E0308 05:31:00.345843 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1087ff58-5435-4a42-a022-7950d5c5a5a6" containerName="controller-manager" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.348777 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1087ff58-5435-4a42-a022-7950d5c5a5a6" containerName="controller-manager" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.349296 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1087ff58-5435-4a42-a022-7950d5c5a5a6" containerName="controller-manager" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.350460 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.354182 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.354832 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.355370 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.356746 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.357458 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.361640 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.361645 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfq54\" (UniqueName: \"kubernetes.io/projected/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-kube-api-access-rfq54\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.361751 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-config\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: 
\"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.361787 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-client-ca\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.361823 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-proxy-ca-bundles\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.361893 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-serving-cert\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.366575 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c857fb667-vw5dt"] Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.369554 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.463638 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-serving-cert\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.464453 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfq54\" (UniqueName: \"kubernetes.io/projected/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-kube-api-access-rfq54\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.464657 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-config\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.465000 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-client-ca\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.465293 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-proxy-ca-bundles\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.466708 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-config\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.466765 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-client-ca\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.468607 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-proxy-ca-bundles\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.474667 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-serving-cert\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.495014 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfq54\" (UniqueName: \"kubernetes.io/projected/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-kube-api-access-rfq54\") pod \"controller-manager-5c857fb667-vw5dt\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 
05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.542634 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" event={"ID":"c465a981-50c1-4ecb-98e2-7c0bf1eac868","Type":"ContainerStarted","Data":"3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20"} Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.542882 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.549180 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.565075 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" podStartSLOduration=3.565040263 podStartE2EDuration="3.565040263s" podCreationTimestamp="2026-03-08 05:30:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:31:00.564294294 +0000 UTC m=+287.481943138" watchObservedRunningTime="2026-03-08 05:31:00.565040263 +0000 UTC m=+287.482689117" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.675556 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:00 crc kubenswrapper[4717]: I0308 05:31:00.937602 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c857fb667-vw5dt"] Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.229630 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdf65"] Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.230119 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cdf65" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerName="registry-server" containerID="cri-o://3ace2bbc842356ba5f91bd1e3c708f66b403d43c6e5c692dc067cae8033fdb39" gracePeriod=2 Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.557883 4717 generic.go:334] "Generic (PLEG): container finished" podID="724eb749-c200-4929-9a63-f3384e410a6f" containerID="a07ae705cad49c7fb94fb794615bd28936bfa2e46656ac76b94da3a1af96a39f" exitCode=0 Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.558020 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7sb6" event={"ID":"724eb749-c200-4929-9a63-f3384e410a6f","Type":"ContainerDied","Data":"a07ae705cad49c7fb94fb794615bd28936bfa2e46656ac76b94da3a1af96a39f"} Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.558314 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7sb6" event={"ID":"724eb749-c200-4929-9a63-f3384e410a6f","Type":"ContainerDied","Data":"1fc7fc6f8869cf3aa98971a73536a89069ae40d88c877ee42f3b36f635cb8b77"} Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.558331 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fc7fc6f8869cf3aa98971a73536a89069ae40d88c877ee42f3b36f635cb8b77" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 
05:31:01.559579 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.560725 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" event={"ID":"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0","Type":"ContainerStarted","Data":"44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77"} Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.560776 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" event={"ID":"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0","Type":"ContainerStarted","Data":"8dd340975b0a5545b168e6e11ec3f1bfc828e4814758860c5e5e356c929ea81e"} Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.560998 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.562979 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.563534 4717 generic.go:334] "Generic (PLEG): container finished" podID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerID="1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6" exitCode=0 Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.563583 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqqgh" event={"ID":"64d82598-c4ba-4e83-8810-c4b9ad5b2f51","Type":"ContainerDied","Data":"1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6"} Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.563606 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqqgh" event={"ID":"64d82598-c4ba-4e83-8810-c4b9ad5b2f51","Type":"ContainerDied","Data":"05fb64367a631eaefb89746bef48fa016400cdbb024195db65fb0616751c59ef"} Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.563624 4717 scope.go:117] "RemoveContainer" containerID="1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.563738 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sqqgh" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.567819 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.573180 4717 generic.go:334] "Generic (PLEG): container finished" podID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerID="3ace2bbc842356ba5f91bd1e3c708f66b403d43c6e5c692dc067cae8033fdb39" exitCode=0 Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.573905 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdf65" event={"ID":"eb6549f4-05b3-4309-b7f4-3b34fe523413","Type":"ContainerDied","Data":"3ace2bbc842356ba5f91bd1e3c708f66b403d43c6e5c692dc067cae8033fdb39"} Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.590215 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-utilities\") pod \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.590290 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-catalog-content\") pod \"724eb749-c200-4929-9a63-f3384e410a6f\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.590391 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j79qz\" (UniqueName: \"kubernetes.io/projected/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-kube-api-access-j79qz\") pod \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.590445 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-utilities\") pod \"724eb749-c200-4929-9a63-f3384e410a6f\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.590468 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qprdm\" (UniqueName: \"kubernetes.io/projected/724eb749-c200-4929-9a63-f3384e410a6f-kube-api-access-qprdm\") pod \"724eb749-c200-4929-9a63-f3384e410a6f\" (UID: \"724eb749-c200-4929-9a63-f3384e410a6f\") " Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.590513 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-catalog-content\") pod \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\" (UID: \"64d82598-c4ba-4e83-8810-c4b9ad5b2f51\") " Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.591312 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-utilities" (OuterVolumeSpecName: "utilities") pod "64d82598-c4ba-4e83-8810-c4b9ad5b2f51" (UID: "64d82598-c4ba-4e83-8810-c4b9ad5b2f51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.592671 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-utilities" (OuterVolumeSpecName: "utilities") pod "724eb749-c200-4929-9a63-f3384e410a6f" (UID: "724eb749-c200-4929-9a63-f3384e410a6f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.602842 4717 scope.go:117] "RemoveContainer" containerID="96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.603559 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-kube-api-access-j79qz" (OuterVolumeSpecName: "kube-api-access-j79qz") pod "64d82598-c4ba-4e83-8810-c4b9ad5b2f51" (UID: "64d82598-c4ba-4e83-8810-c4b9ad5b2f51"). InnerVolumeSpecName "kube-api-access-j79qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.609952 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724eb749-c200-4929-9a63-f3384e410a6f-kube-api-access-qprdm" (OuterVolumeSpecName: "kube-api-access-qprdm") pod "724eb749-c200-4929-9a63-f3384e410a6f" (UID: "724eb749-c200-4929-9a63-f3384e410a6f"). InnerVolumeSpecName "kube-api-access-qprdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.657480 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" podStartSLOduration=4.657457123 podStartE2EDuration="4.657457123s" podCreationTimestamp="2026-03-08 05:30:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:31:01.653291544 +0000 UTC m=+288.570940388" watchObservedRunningTime="2026-03-08 05:31:01.657457123 +0000 UTC m=+288.575105957" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.658617 4717 scope.go:117] "RemoveContainer" containerID="5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.692093 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "724eb749-c200-4929-9a63-f3384e410a6f" (UID: "724eb749-c200-4929-9a63-f3384e410a6f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.692369 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.692383 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.692394 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j79qz\" (UniqueName: \"kubernetes.io/projected/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-kube-api-access-j79qz\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.692405 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/724eb749-c200-4929-9a63-f3384e410a6f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.692414 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qprdm\" (UniqueName: \"kubernetes.io/projected/724eb749-c200-4929-9a63-f3384e410a6f-kube-api-access-qprdm\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.696819 4717 scope.go:117] "RemoveContainer" containerID="1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6" Mar 08 05:31:01 crc kubenswrapper[4717]: E0308 05:31:01.697406 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6\": container with ID starting with 1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6 not found: ID does not exist" 
containerID="1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.697452 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6"} err="failed to get container status \"1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6\": rpc error: code = NotFound desc = could not find container \"1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6\": container with ID starting with 1d99a291ddcf76b4b1057f9ffd6f82b92405ad42a48211a9e4ab93cdca611af6 not found: ID does not exist" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.697473 4717 scope.go:117] "RemoveContainer" containerID="96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0" Mar 08 05:31:01 crc kubenswrapper[4717]: E0308 05:31:01.697836 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0\": container with ID starting with 96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0 not found: ID does not exist" containerID="96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.697886 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0"} err="failed to get container status \"96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0\": rpc error: code = NotFound desc = could not find container \"96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0\": container with ID starting with 96d7ccde0371a5a706c3477a58848f91f5f804f7239123d813e1983adac2d1c0 not found: ID does not exist" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.697927 4717 scope.go:117] 
"RemoveContainer" containerID="5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30" Mar 08 05:31:01 crc kubenswrapper[4717]: E0308 05:31:01.698191 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30\": container with ID starting with 5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30 not found: ID does not exist" containerID="5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.698224 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30"} err="failed to get container status \"5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30\": rpc error: code = NotFound desc = could not find container \"5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30\": container with ID starting with 5bfa7b11e99cc1f543c806ad6daaba4a0d229b0b0c84adc05945fd67e3a7fa30 not found: ID does not exist" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.714658 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64d82598-c4ba-4e83-8810-c4b9ad5b2f51" (UID: "64d82598-c4ba-4e83-8810-c4b9ad5b2f51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.794167 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64d82598-c4ba-4e83-8810-c4b9ad5b2f51-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.860846 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.894816 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-utilities\") pod \"eb6549f4-05b3-4309-b7f4-3b34fe523413\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.895640 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-utilities" (OuterVolumeSpecName: "utilities") pod "eb6549f4-05b3-4309-b7f4-3b34fe523413" (UID: "eb6549f4-05b3-4309-b7f4-3b34fe523413"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.895845 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9zzs\" (UniqueName: \"kubernetes.io/projected/eb6549f4-05b3-4309-b7f4-3b34fe523413-kube-api-access-n9zzs\") pod \"eb6549f4-05b3-4309-b7f4-3b34fe523413\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.895996 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-catalog-content\") pod \"eb6549f4-05b3-4309-b7f4-3b34fe523413\" (UID: \"eb6549f4-05b3-4309-b7f4-3b34fe523413\") " Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.897637 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sqqgh"] Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.899649 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6549f4-05b3-4309-b7f4-3b34fe523413-kube-api-access-n9zzs" (OuterVolumeSpecName: 
"kube-api-access-n9zzs") pod "eb6549f4-05b3-4309-b7f4-3b34fe523413" (UID: "eb6549f4-05b3-4309-b7f4-3b34fe523413"). InnerVolumeSpecName "kube-api-access-n9zzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.908466 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.908517 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9zzs\" (UniqueName: \"kubernetes.io/projected/eb6549f4-05b3-4309-b7f4-3b34fe523413-kube-api-access-n9zzs\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:01 crc kubenswrapper[4717]: I0308 05:31:01.909700 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sqqgh"] Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.130578 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb6549f4-05b3-4309-b7f4-3b34fe523413" (UID: "eb6549f4-05b3-4309-b7f4-3b34fe523413"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.213141 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6549f4-05b3-4309-b7f4-3b34fe523413-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341106 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-984c8fd85-hf6fn"] Mar 08 05:31:02 crc kubenswrapper[4717]: E0308 05:31:02.341388 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerName="registry-server" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341401 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerName="registry-server" Mar 08 05:31:02 crc kubenswrapper[4717]: E0308 05:31:02.341421 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724eb749-c200-4929-9a63-f3384e410a6f" containerName="registry-server" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341427 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="724eb749-c200-4929-9a63-f3384e410a6f" containerName="registry-server" Mar 08 05:31:02 crc kubenswrapper[4717]: E0308 05:31:02.341436 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724eb749-c200-4929-9a63-f3384e410a6f" containerName="extract-content" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341443 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="724eb749-c200-4929-9a63-f3384e410a6f" containerName="extract-content" Mar 08 05:31:02 crc kubenswrapper[4717]: E0308 05:31:02.341454 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerName="extract-content" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341462 4717 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerName="extract-content" Mar 08 05:31:02 crc kubenswrapper[4717]: E0308 05:31:02.341472 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerName="extract-content" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341479 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerName="extract-content" Mar 08 05:31:02 crc kubenswrapper[4717]: E0308 05:31:02.341501 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerName="registry-server" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341508 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerName="registry-server" Mar 08 05:31:02 crc kubenswrapper[4717]: E0308 05:31:02.341518 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerName="extract-utilities" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341525 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerName="extract-utilities" Mar 08 05:31:02 crc kubenswrapper[4717]: E0308 05:31:02.341534 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerName="extract-utilities" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341541 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerName="extract-utilities" Mar 08 05:31:02 crc kubenswrapper[4717]: E0308 05:31:02.341550 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724eb749-c200-4929-9a63-f3384e410a6f" containerName="extract-utilities" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341558 4717 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="724eb749-c200-4929-9a63-f3384e410a6f" containerName="extract-utilities" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341706 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" containerName="registry-server" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341721 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="724eb749-c200-4929-9a63-f3384e410a6f" containerName="registry-server" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.341733 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" containerName="registry-server" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.342213 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.348623 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.348768 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.348896 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.349703 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.349923 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.350029 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.350124 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.350301 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.350532 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.352383 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.352979 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.356016 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.360825 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.371796 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.374166 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.402982 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-984c8fd85-hf6fn"] Mar 08 05:31:02 crc 
kubenswrapper[4717]: I0308 05:31:02.416926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-template-login\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.416987 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-audit-policies\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417015 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-router-certs\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417058 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-service-ca\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417107 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-audit-dir\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417128 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417184 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-template-error\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417208 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417226 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-session\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417294 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr5qd\" (UniqueName: \"kubernetes.io/projected/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-kube-api-access-xr5qd\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417314 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.417338 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc 
kubenswrapper[4717]: I0308 05:31:02.417357 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.526770 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-template-error\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.526852 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.526890 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-session\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.526916 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr5qd\" (UniqueName: 
\"kubernetes.io/projected/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-kube-api-access-xr5qd\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.526948 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.526978 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.527015 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.527053 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-template-login\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " 
pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.527086 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-audit-policies\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.527113 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-router-certs\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.527145 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-service-ca\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.527173 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-audit-dir\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.527200 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.527234 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.529578 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-audit-policies\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.529707 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-audit-dir\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.530702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc 
kubenswrapper[4717]: I0308 05:31:02.530825 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-service-ca\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.531195 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.533522 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-template-login\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.534253 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.534490 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-router-certs\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.535310 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-session\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.535797 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-template-error\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.536202 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.542964 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc 
kubenswrapper[4717]: I0308 05:31:02.544254 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.547417 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr5qd\" (UniqueName: \"kubernetes.io/projected/b686deae-0aec-4a5a-bb26-4b3fa99b3dd5-kube-api-access-xr5qd\") pod \"oauth-openshift-984c8fd85-hf6fn\" (UID: \"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5\") " pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.585857 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdf65" event={"ID":"eb6549f4-05b3-4309-b7f4-3b34fe523413","Type":"ContainerDied","Data":"25f59b673485b2e88ed5ca9ba168a3145b3fe2694c8f452a945cc456c9f39c0d"} Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.585928 4717 scope.go:117] "RemoveContainer" containerID="3ace2bbc842356ba5f91bd1e3c708f66b403d43c6e5c692dc067cae8033fdb39" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.585965 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdf65" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.586922 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7sb6" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.602861 4717 scope.go:117] "RemoveContainer" containerID="585a984042a44528285f33a5892346340b8be4dbe691652be068e68a72eebd86" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.617675 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7sb6"] Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.640804 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n7sb6"] Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.647178 4717 scope.go:117] "RemoveContainer" containerID="177d6bd85165a28b2665e2355cfa761222f200bd9e49303f1f61f66f301ba8b9" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.654733 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdf65"] Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.661178 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:02 crc kubenswrapper[4717]: I0308 05:31:02.666318 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdf65"] Mar 08 05:31:03 crc kubenswrapper[4717]: I0308 05:31:03.125352 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-984c8fd85-hf6fn"] Mar 08 05:31:03 crc kubenswrapper[4717]: W0308 05:31:03.139923 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb686deae_0aec_4a5a_bb26_4b3fa99b3dd5.slice/crio-9dc09bc537f351d25e31f736112402aca9ec134e7227e7a80ab5e5f2f478390f WatchSource:0}: Error finding container 9dc09bc537f351d25e31f736112402aca9ec134e7227e7a80ab5e5f2f478390f: Status 404 returned error can't find the container with id 9dc09bc537f351d25e31f736112402aca9ec134e7227e7a80ab5e5f2f478390f Mar 08 05:31:03 crc kubenswrapper[4717]: I0308 05:31:03.596761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" event={"ID":"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5","Type":"ContainerStarted","Data":"9dc09bc537f351d25e31f736112402aca9ec134e7227e7a80ab5e5f2f478390f"} Mar 08 05:31:03 crc kubenswrapper[4717]: I0308 05:31:03.799903 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64d82598-c4ba-4e83-8810-c4b9ad5b2f51" path="/var/lib/kubelet/pods/64d82598-c4ba-4e83-8810-c4b9ad5b2f51/volumes" Mar 08 05:31:03 crc kubenswrapper[4717]: I0308 05:31:03.801782 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="724eb749-c200-4929-9a63-f3384e410a6f" path="/var/lib/kubelet/pods/724eb749-c200-4929-9a63-f3384e410a6f/volumes" Mar 08 05:31:03 crc kubenswrapper[4717]: I0308 05:31:03.803162 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6549f4-05b3-4309-b7f4-3b34fe523413" 
path="/var/lib/kubelet/pods/eb6549f4-05b3-4309-b7f4-3b34fe523413/volumes" Mar 08 05:31:04 crc kubenswrapper[4717]: I0308 05:31:04.121005 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:31:04 crc kubenswrapper[4717]: I0308 05:31:04.121630 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:31:04 crc kubenswrapper[4717]: I0308 05:31:04.121966 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:31:04 crc kubenswrapper[4717]: I0308 05:31:04.123531 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 05:31:04 crc kubenswrapper[4717]: I0308 05:31:04.123875 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c" gracePeriod=600 Mar 08 05:31:04 crc kubenswrapper[4717]: I0308 05:31:04.612513 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" event={"ID":"b686deae-0aec-4a5a-bb26-4b3fa99b3dd5","Type":"ContainerStarted","Data":"584a4d03d1594281aa554b62c2a2cab9e5539c13322afdb06e356ce85a7c70bb"} Mar 08 05:31:04 crc kubenswrapper[4717]: I0308 05:31:04.613153 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:04 crc kubenswrapper[4717]: I0308 05:31:04.649827 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" podStartSLOduration=35.649794579 podStartE2EDuration="35.649794579s" podCreationTimestamp="2026-03-08 05:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:31:04.648174786 +0000 UTC m=+291.565823630" watchObservedRunningTime="2026-03-08 05:31:04.649794579 +0000 UTC m=+291.567443423" Mar 08 05:31:04 crc kubenswrapper[4717]: I0308 05:31:04.859385 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-984c8fd85-hf6fn" Mar 08 05:31:05 crc kubenswrapper[4717]: I0308 05:31:05.642426 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c" exitCode=0 Mar 08 05:31:05 crc kubenswrapper[4717]: I0308 05:31:05.642530 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c"} Mar 08 05:31:05 crc kubenswrapper[4717]: I0308 05:31:05.643084 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"28bae32581ceebc3d7a7b35115231ff3e24099f510bd0f100a271378cb568f79"} Mar 08 05:31:16 crc kubenswrapper[4717]: I0308 05:31:16.986797 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c857fb667-vw5dt"] Mar 08 05:31:16 crc kubenswrapper[4717]: I0308 05:31:16.988218 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" podUID="eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" containerName="controller-manager" containerID="cri-o://44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77" gracePeriod=30 Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.079766 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h"] Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.080088 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" podUID="c465a981-50c1-4ecb-98e2-7c0bf1eac868" containerName="route-controller-manager" containerID="cri-o://3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20" gracePeriod=30 Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.629746 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.637281 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.686029 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-config\") pod \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.686101 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzdzf\" (UniqueName: \"kubernetes.io/projected/c465a981-50c1-4ecb-98e2-7c0bf1eac868-kube-api-access-gzdzf\") pod \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.686128 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-proxy-ca-bundles\") pod \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.686171 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-serving-cert\") pod \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.687427 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" (UID: "eae32cbd-dbeb-4be9-afd0-e15ea2039fc0"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.687621 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-config" (OuterVolumeSpecName: "config") pod "c465a981-50c1-4ecb-98e2-7c0bf1eac868" (UID: "c465a981-50c1-4ecb-98e2-7c0bf1eac868"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.696961 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" (UID: "eae32cbd-dbeb-4be9-afd0-e15ea2039fc0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.697073 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c465a981-50c1-4ecb-98e2-7c0bf1eac868-kube-api-access-gzdzf" (OuterVolumeSpecName: "kube-api-access-gzdzf") pod "c465a981-50c1-4ecb-98e2-7c0bf1eac868" (UID: "c465a981-50c1-4ecb-98e2-7c0bf1eac868"). InnerVolumeSpecName "kube-api-access-gzdzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.740816 4717 generic.go:334] "Generic (PLEG): container finished" podID="eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" containerID="44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77" exitCode=0 Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.740928 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" event={"ID":"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0","Type":"ContainerDied","Data":"44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77"} Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.740973 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" event={"ID":"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0","Type":"ContainerDied","Data":"8dd340975b0a5545b168e6e11ec3f1bfc828e4814758860c5e5e356c929ea81e"} Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.741000 4717 scope.go:117] "RemoveContainer" containerID="44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.741150 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c857fb667-vw5dt" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.744216 4717 generic.go:334] "Generic (PLEG): container finished" podID="c465a981-50c1-4ecb-98e2-7c0bf1eac868" containerID="3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20" exitCode=0 Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.744247 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" event={"ID":"c465a981-50c1-4ecb-98e2-7c0bf1eac868","Type":"ContainerDied","Data":"3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20"} Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.744267 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" event={"ID":"c465a981-50c1-4ecb-98e2-7c0bf1eac868","Type":"ContainerDied","Data":"1b33eabb299c83ad67b739c3768f6bacad1869ef616a6c9eebd318ac7145ff99"} Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.744311 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.765176 4717 scope.go:117] "RemoveContainer" containerID="44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77" Mar 08 05:31:17 crc kubenswrapper[4717]: E0308 05:31:17.765727 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77\": container with ID starting with 44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77 not found: ID does not exist" containerID="44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.765764 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77"} err="failed to get container status \"44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77\": rpc error: code = NotFound desc = could not find container \"44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77\": container with ID starting with 44d37b1841b6001fdceedfb4b75a0124787ab102d9d06f029b5a51b0a6835d77 not found: ID does not exist" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.765787 4717 scope.go:117] "RemoveContainer" containerID="3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.786644 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfq54\" (UniqueName: \"kubernetes.io/projected/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-kube-api-access-rfq54\") pod \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.786705 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-client-ca\") pod \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.786733 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c465a981-50c1-4ecb-98e2-7c0bf1eac868-serving-cert\") pod \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.786754 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-client-ca\") pod \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\" (UID: \"c465a981-50c1-4ecb-98e2-7c0bf1eac868\") " Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.786789 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-config\") pod \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\" (UID: \"eae32cbd-dbeb-4be9-afd0-e15ea2039fc0\") " Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.786967 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.786981 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzdzf\" (UniqueName: \"kubernetes.io/projected/c465a981-50c1-4ecb-98e2-7c0bf1eac868-kube-api-access-gzdzf\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.786991 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.787000 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.787310 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-client-ca" (OuterVolumeSpecName: "client-ca") pod "eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" (UID: "eae32cbd-dbeb-4be9-afd0-e15ea2039fc0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.787654 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-config" (OuterVolumeSpecName: "config") pod "eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" (UID: "eae32cbd-dbeb-4be9-afd0-e15ea2039fc0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.788082 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-client-ca" (OuterVolumeSpecName: "client-ca") pod "c465a981-50c1-4ecb-98e2-7c0bf1eac868" (UID: "c465a981-50c1-4ecb-98e2-7c0bf1eac868"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.788204 4717 scope.go:117] "RemoveContainer" containerID="3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20" Mar 08 05:31:17 crc kubenswrapper[4717]: E0308 05:31:17.789066 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20\": container with ID starting with 3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20 not found: ID does not exist" containerID="3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.789103 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20"} err="failed to get container status \"3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20\": rpc error: code = NotFound desc = could not find container \"3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20\": container with ID starting with 3ea21861ecfd948b979a758b5d500e70396f547aaddef6782bf730cea8ffad20 not found: ID does not exist" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.790292 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-kube-api-access-rfq54" (OuterVolumeSpecName: "kube-api-access-rfq54") pod "eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" (UID: "eae32cbd-dbeb-4be9-afd0-e15ea2039fc0"). InnerVolumeSpecName "kube-api-access-rfq54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.790914 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c465a981-50c1-4ecb-98e2-7c0bf1eac868-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c465a981-50c1-4ecb-98e2-7c0bf1eac868" (UID: "c465a981-50c1-4ecb-98e2-7c0bf1eac868"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.888401 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c465a981-50c1-4ecb-98e2-7c0bf1eac868-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.888466 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c465a981-50c1-4ecb-98e2-7c0bf1eac868-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.888475 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.888485 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfq54\" (UniqueName: \"kubernetes.io/projected/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-kube-api-access-rfq54\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:17 crc kubenswrapper[4717]: I0308 05:31:17.888500 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.061959 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c857fb667-vw5dt"] Mar 08 
05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.071161 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c857fb667-vw5dt"] Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.077865 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h"] Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.088084 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d94d9f8b-bng8h"] Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.356563 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2"] Mar 08 05:31:18 crc kubenswrapper[4717]: E0308 05:31:18.356921 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" containerName="controller-manager" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.356948 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" containerName="controller-manager" Mar 08 05:31:18 crc kubenswrapper[4717]: E0308 05:31:18.356978 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c465a981-50c1-4ecb-98e2-7c0bf1eac868" containerName="route-controller-manager" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.356988 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c465a981-50c1-4ecb-98e2-7c0bf1eac868" containerName="route-controller-manager" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.357108 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" containerName="controller-manager" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.357127 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c465a981-50c1-4ecb-98e2-7c0bf1eac868" 
containerName="route-controller-manager" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.357701 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.361062 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.361362 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.362064 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79ff68f84c-kj8xm"] Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.362123 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.362153 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.362934 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.364762 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.365660 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.367941 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.368251 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.368419 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.368424 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.368796 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.369323 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.375603 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.393407 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79ff68f84c-kj8xm"] Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 
05:31:18.394466 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9tzx\" (UniqueName: \"kubernetes.io/projected/89c4be73-2f2f-4703-b556-2a51a7962f81-kube-api-access-g9tzx\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.394582 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxnzl\" (UniqueName: \"kubernetes.io/projected/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-kube-api-access-fxnzl\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.394711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-client-ca\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.394816 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89c4be73-2f2f-4703-b556-2a51a7962f81-serving-cert\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.394954 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/89c4be73-2f2f-4703-b556-2a51a7962f81-client-ca\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.395042 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-config\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.395128 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-serving-cert\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.395220 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c4be73-2f2f-4703-b556-2a51a7962f81-config\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.395307 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-proxy-ca-bundles\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " 
pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.441790 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2"] Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.496290 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89c4be73-2f2f-4703-b556-2a51a7962f81-serving-cert\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.496379 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89c4be73-2f2f-4703-b556-2a51a7962f81-client-ca\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.496422 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-config\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.496484 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-serving-cert\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: 
I0308 05:31:18.496531 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c4be73-2f2f-4703-b556-2a51a7962f81-config\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.496595 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-proxy-ca-bundles\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.496663 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9tzx\" (UniqueName: \"kubernetes.io/projected/89c4be73-2f2f-4703-b556-2a51a7962f81-kube-api-access-g9tzx\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.496749 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxnzl\" (UniqueName: \"kubernetes.io/projected/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-kube-api-access-fxnzl\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.496799 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-client-ca\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: 
\"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.497780 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89c4be73-2f2f-4703-b556-2a51a7962f81-client-ca\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.498329 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-client-ca\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.498428 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c4be73-2f2f-4703-b556-2a51a7962f81-config\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.498892 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-config\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.500650 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-proxy-ca-bundles\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.501202 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89c4be73-2f2f-4703-b556-2a51a7962f81-serving-cert\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.505450 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-serving-cert\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.513900 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9tzx\" (UniqueName: \"kubernetes.io/projected/89c4be73-2f2f-4703-b556-2a51a7962f81-kube-api-access-g9tzx\") pod \"route-controller-manager-7d6d57bb45-rxqz2\" (UID: \"89c4be73-2f2f-4703-b556-2a51a7962f81\") " pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.520886 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxnzl\" (UniqueName: \"kubernetes.io/projected/5f6f2c16-ce12-453a-a8cd-80cc875c17b2-kube-api-access-fxnzl\") pod \"controller-manager-79ff68f84c-kj8xm\" (UID: \"5f6f2c16-ce12-453a-a8cd-80cc875c17b2\") " pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:18 crc kubenswrapper[4717]: 
I0308 05:31:18.676315 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:18 crc kubenswrapper[4717]: I0308 05:31:18.688058 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.178302 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2"] Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.237122 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79ff68f84c-kj8xm"] Mar 08 05:31:19 crc kubenswrapper[4717]: W0308 05:31:19.244098 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f6f2c16_ce12_453a_a8cd_80cc875c17b2.slice/crio-e4bf29e433c7e24673188c7131db6e33a4d867680ef734d42dee2a69e59e3cd0 WatchSource:0}: Error finding container e4bf29e433c7e24673188c7131db6e33a4d867680ef734d42dee2a69e59e3cd0: Status 404 returned error can't find the container with id e4bf29e433c7e24673188c7131db6e33a4d867680ef734d42dee2a69e59e3cd0 Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.707931 4717 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709059 4717 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709249 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709446 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79" gracePeriod=15 Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709584 4717 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709560 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772" gracePeriod=15 Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709560 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446" gracePeriod=15 Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709576 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3" gracePeriod=15 Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709619 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9" gracePeriod=15 Mar 08 05:31:19 crc kubenswrapper[4717]: E0308 05:31:19.709922 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709942 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 08 05:31:19 crc kubenswrapper[4717]: E0308 05:31:19.709952 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709960 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: E0308 05:31:19.709970 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709976 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: E0308 05:31:19.709983 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.709988 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 05:31:19 crc kubenswrapper[4717]: E0308 05:31:19.709996 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710002 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 05:31:19 crc kubenswrapper[4717]: E0308 05:31:19.710019 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710024 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: E0308 05:31:19.710032 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710039 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: E0308 05:31:19.710047 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710057 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: E0308 05:31:19.710066 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710072 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 05:31:19 crc kubenswrapper[4717]: E0308 
05:31:19.710082 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710088 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710186 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710196 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710203 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710210 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710217 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710224 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710230 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710254 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.710437 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.716182 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.716250 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.716323 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.716373 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.716396 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.716438 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.716497 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.716569 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.750225 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.765788 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] 
Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.777875 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" event={"ID":"5f6f2c16-ce12-453a-a8cd-80cc875c17b2","Type":"ContainerStarted","Data":"80f98d15d00d59cde15ff9ba8aeb84fd423bb79487796e14ee1f891230ed2cdc"} Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.777948 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" event={"ID":"5f6f2c16-ce12-453a-a8cd-80cc875c17b2","Type":"ContainerStarted","Data":"e4bf29e433c7e24673188c7131db6e33a4d867680ef734d42dee2a69e59e3cd0"} Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.778289 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.780201 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" event={"ID":"89c4be73-2f2f-4703-b556-2a51a7962f81","Type":"ContainerStarted","Data":"4328749d9dd1beeb2ffd0423e79ccb902280a5dee5c3cbcc430be19518170196"} Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.780249 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" event={"ID":"89c4be73-2f2f-4703-b556-2a51a7962f81","Type":"ContainerStarted","Data":"65416c82c682a395580b7d5769d3db3beb333ad700cb7fceaf83ce9fd733c458"} Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.780420 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.787798 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c465a981-50c1-4ecb-98e2-7c0bf1eac868" 
path="/var/lib/kubelet/pods/c465a981-50c1-4ecb-98e2-7c0bf1eac868/volumes" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.788594 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eae32cbd-dbeb-4be9-afd0-e15ea2039fc0" path="/var/lib/kubelet/pods/eae32cbd-dbeb-4be9-afd0-e15ea2039fc0/volumes" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.789118 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.818516 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.818598 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.818646 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.818673 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.818785 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.818809 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.818825 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.818874 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.818976 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.819987 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.821111 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.821486 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.821513 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.821536 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.822147 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:19 crc kubenswrapper[4717]: I0308 05:31:19.822424 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.063839 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 05:31:20 crc kubenswrapper[4717]: W0308 05:31:20.084837 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-d2e51338011b29f2bbf97e3d70ac9f80e9098da0b8bee23ac28d62367c436a99 WatchSource:0}: Error finding container d2e51338011b29f2bbf97e3d70ac9f80e9098da0b8bee23ac28d62367c436a99: Status 404 returned error can't find the container with id d2e51338011b29f2bbf97e3d70ac9f80e9098da0b8bee23ac28d62367c436a99 Mar 08 05:31:20 crc kubenswrapper[4717]: E0308 05:31:20.088389 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ac6b415e7b288 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:31:20.087630472 +0000 UTC m=+307.005279316,LastTimestamp:2026-03-08 05:31:20.087630472 +0000 UTC m=+307.005279316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.780464 4717 patch_prober.go:28] interesting pod/route-controller-manager-7d6d57bb45-rxqz2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.781132 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.790841 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.793391 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.794519 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446" exitCode=0 Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.794572 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772" exitCode=0 Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.794617 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3" exitCode=0 Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.794636 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9" exitCode=2 Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.794646 4717 scope.go:117] "RemoveContainer" containerID="541e240c20237f43ff721f1eba3e5cfdee21355c9c497975747cc76880b85965" Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.799292 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1376032517a22727c50ed3c89724b142db50e07138392c534e6b54f4b079198b"} Mar 08 05:31:20 crc kubenswrapper[4717]: I0308 05:31:20.799364 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d2e51338011b29f2bbf97e3d70ac9f80e9098da0b8bee23ac28d62367c436a99"} 
Mar 08 05:31:21 crc kubenswrapper[4717]: I0308 05:31:21.805406 4717 patch_prober.go:28] interesting pod/route-controller-manager-7d6d57bb45-rxqz2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:31:21 crc kubenswrapper[4717]: I0308 05:31:21.805955 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:31:21 crc kubenswrapper[4717]: I0308 05:31:21.808840 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.303484 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.304800 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.459497 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.459593 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.459639 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.459637 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.459702 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.459800 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.459972 4717 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.459987 4717 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.459997 4717 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.821897 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.823033 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79" exitCode=0 Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.823136 4717 scope.go:117] "RemoveContainer" containerID="a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.823168 4717 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.857083 4717 scope.go:117] "RemoveContainer" containerID="aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.899163 4717 scope.go:117] "RemoveContainer" containerID="527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.944546 4717 scope.go:117] "RemoveContainer" containerID="e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9" Mar 08 05:31:22 crc kubenswrapper[4717]: I0308 05:31:22.975497 4717 scope.go:117] "RemoveContainer" containerID="6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.008573 4717 scope.go:117] "RemoveContainer" containerID="7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.046351 4717 scope.go:117] "RemoveContainer" containerID="a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446" Mar 08 05:31:23 crc kubenswrapper[4717]: E0308 05:31:23.047490 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\": container with ID starting with a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446 not found: ID does not exist" containerID="a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.047628 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446"} err="failed to get container status \"a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\": rpc error: 
code = NotFound desc = could not find container \"a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446\": container with ID starting with a163d92ae470200d1394891c5c95da789193cb488ef23066958191f889ad5446 not found: ID does not exist" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.047673 4717 scope.go:117] "RemoveContainer" containerID="aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772" Mar 08 05:31:23 crc kubenswrapper[4717]: E0308 05:31:23.049708 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\": container with ID starting with aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772 not found: ID does not exist" containerID="aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.049792 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772"} err="failed to get container status \"aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\": rpc error: code = NotFound desc = could not find container \"aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772\": container with ID starting with aee97a4f30e9838528ca2c393ed085915775d55e929c6669167b9fdcbe661772 not found: ID does not exist" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.049846 4717 scope.go:117] "RemoveContainer" containerID="527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3" Mar 08 05:31:23 crc kubenswrapper[4717]: E0308 05:31:23.051254 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\": container with ID starting with 
527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3 not found: ID does not exist" containerID="527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.051403 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3"} err="failed to get container status \"527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\": rpc error: code = NotFound desc = could not find container \"527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3\": container with ID starting with 527467667a3aaf6fd12d3a76ca1eccad74a9deb57e212791ada669d246e4d5e3 not found: ID does not exist" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.051948 4717 scope.go:117] "RemoveContainer" containerID="e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9" Mar 08 05:31:23 crc kubenswrapper[4717]: E0308 05:31:23.053855 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\": container with ID starting with e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9 not found: ID does not exist" containerID="e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.053983 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9"} err="failed to get container status \"e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\": rpc error: code = NotFound desc = could not find container \"e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9\": container with ID starting with e6785d67b6037d51eeaa17807f6aefb45d951f6d5f7f2cc4540d0b6fc131fbc9 not found: ID does not 
exist" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.054027 4717 scope.go:117] "RemoveContainer" containerID="6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79" Mar 08 05:31:23 crc kubenswrapper[4717]: E0308 05:31:23.054852 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\": container with ID starting with 6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79 not found: ID does not exist" containerID="6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.054914 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79"} err="failed to get container status \"6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\": rpc error: code = NotFound desc = could not find container \"6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79\": container with ID starting with 6a5475211c85f699b65ff30f4f978e90a6fc4a21d88a89709bffee18285daa79 not found: ID does not exist" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.054951 4717 scope.go:117] "RemoveContainer" containerID="7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8" Mar 08 05:31:23 crc kubenswrapper[4717]: E0308 05:31:23.055429 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\": container with ID starting with 7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8 not found: ID does not exist" containerID="7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.055556 4717 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8"} err="failed to get container status \"7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\": rpc error: code = NotFound desc = could not find container \"7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8\": container with ID starting with 7d53ff054fb726abd1d52681074d1ae302cb7c5cac74661f38dc6249a10df7d8 not found: ID does not exist" Mar 08 05:31:23 crc kubenswrapper[4717]: I0308 05:31:23.800837 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 08 05:31:24 crc kubenswrapper[4717]: I0308 05:31:24.848649 4717 status_manager.go:851] "Failed to get status for pod" podUID="5f6f2c16-ce12-453a-a8cd-80cc875c17b2" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-79ff68f84c-kj8xm\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:24 crc kubenswrapper[4717]: I0308 05:31:24.849362 4717 status_manager.go:851] "Failed to get status for pod" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7d6d57bb45-rxqz2\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:24 crc kubenswrapper[4717]: I0308 05:31:24.850413 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.44:6443: connect: connection refused" Mar 08 05:31:24 crc kubenswrapper[4717]: I0308 05:31:24.851170 4717 status_manager.go:851] "Failed to get status for pod" podUID="5f6f2c16-ce12-453a-a8cd-80cc875c17b2" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-79ff68f84c-kj8xm\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:24 crc kubenswrapper[4717]: I0308 05:31:24.861833 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:24 crc kubenswrapper[4717]: I0308 05:31:24.862425 4717 status_manager.go:851] "Failed to get status for pod" podUID="5f6f2c16-ce12-453a-a8cd-80cc875c17b2" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-79ff68f84c-kj8xm\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:24 crc kubenswrapper[4717]: I0308 05:31:24.863001 4717 status_manager.go:851] "Failed to get status for pod" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7d6d57bb45-rxqz2\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:25 crc kubenswrapper[4717]: E0308 05:31:25.024101 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ac6b415e7b288 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:31:20.087630472 +0000 UTC m=+307.005279316,LastTimestamp:2026-03-08 05:31:20.087630472 +0000 UTC m=+307.005279316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:31:25 crc kubenswrapper[4717]: I0308 05:31:25.854078 4717 generic.go:334] "Generic (PLEG): container finished" podID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" containerID="59cd832ba158a57bd24080c46c76326c92c95d9dbf2ea262f8b7e79ff0b6bfc3" exitCode=0 Mar 08 05:31:25 crc kubenswrapper[4717]: I0308 05:31:25.854204 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84","Type":"ContainerDied","Data":"59cd832ba158a57bd24080c46c76326c92c95d9dbf2ea262f8b7e79ff0b6bfc3"} Mar 08 05:31:25 crc kubenswrapper[4717]: I0308 05:31:25.859263 4717 status_manager.go:851] "Failed to get status for pod" podUID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: 
connection refused" Mar 08 05:31:25 crc kubenswrapper[4717]: I0308 05:31:25.860287 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:25 crc kubenswrapper[4717]: I0308 05:31:25.860981 4717 status_manager.go:851] "Failed to get status for pod" podUID="5f6f2c16-ce12-453a-a8cd-80cc875c17b2" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-79ff68f84c-kj8xm\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:25 crc kubenswrapper[4717]: I0308 05:31:25.861815 4717 status_manager.go:851] "Failed to get status for pod" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7d6d57bb45-rxqz2\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.305894 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.307472 4717 status_manager.go:851] "Failed to get status for pod" podUID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.308283 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.308912 4717 status_manager.go:851] "Failed to get status for pod" podUID="5f6f2c16-ce12-453a-a8cd-80cc875c17b2" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-79ff68f84c-kj8xm\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.309504 4717 status_manager.go:851] "Failed to get status for pod" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7d6d57bb45-rxqz2\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.357100 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kube-api-access\") pod \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.357228 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kubelet-dir\") pod \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.357256 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-var-lock\") pod \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\" (UID: \"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84\") " Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.357391 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" (UID: "b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.357471 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-var-lock" (OuterVolumeSpecName: "var-lock") pod "b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" (UID: "b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.368964 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" (UID: "b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.459587 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.459638 4717 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-var-lock\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.459662 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.875931 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84","Type":"ContainerDied","Data":"3384aee515161a59ecf6a7cd298815beafe642cba4e129c42f66d9adaa96ade5"} Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.876012 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3384aee515161a59ecf6a7cd298815beafe642cba4e129c42f66d9adaa96ade5" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.876033 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.883822 4717 status_manager.go:851] "Failed to get status for pod" podUID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.884426 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.885066 4717 status_manager.go:851] "Failed to get status for pod" podUID="5f6f2c16-ce12-453a-a8cd-80cc875c17b2" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-79ff68f84c-kj8xm\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: I0308 05:31:27.885870 4717 status_manager.go:851] "Failed to get status for pod" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7d6d57bb45-rxqz2\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: E0308 05:31:27.998185 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial 
tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: E0308 05:31:27.999067 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:27 crc kubenswrapper[4717]: E0308 05:31:27.999807 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:28 crc kubenswrapper[4717]: E0308 05:31:28.000311 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:28 crc kubenswrapper[4717]: E0308 05:31:28.000788 4717 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:28 crc kubenswrapper[4717]: I0308 05:31:28.000852 4717 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 05:31:28 crc kubenswrapper[4717]: E0308 05:31:28.001344 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="200ms" Mar 08 05:31:28 crc kubenswrapper[4717]: E0308 05:31:28.203113 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="400ms" Mar 08 05:31:28 crc kubenswrapper[4717]: E0308 05:31:28.604240 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="800ms" Mar 08 05:31:29 crc kubenswrapper[4717]: E0308 05:31:29.405894 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="1.6s" Mar 08 05:31:29 crc kubenswrapper[4717]: I0308 05:31:29.678059 4717 patch_prober.go:28] interesting pod/route-controller-manager-7d6d57bb45-rxqz2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:31:29 crc kubenswrapper[4717]: I0308 05:31:29.678221 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:31:31 crc kubenswrapper[4717]: E0308 05:31:31.006832 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="3.2s" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.781034 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.786521 4717 status_manager.go:851] "Failed to get status for pod" podUID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.787301 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.788296 4717 status_manager.go:851] "Failed to get status for pod" podUID="5f6f2c16-ce12-453a-a8cd-80cc875c17b2" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-79ff68f84c-kj8xm\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.789281 4717 status_manager.go:851] "Failed to get status for pod" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7d6d57bb45-rxqz2\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.790251 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.790808 4717 status_manager.go:851] "Failed to get status for pod" podUID="5f6f2c16-ce12-453a-a8cd-80cc875c17b2" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-79ff68f84c-kj8xm\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.791375 4717 status_manager.go:851] "Failed to get status for pod" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7d6d57bb45-rxqz2\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.791948 4717 status_manager.go:851] "Failed to get status for pod" podUID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.812473 4717 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc8bca0b-3590-4748-8dc9-d659f09631bd" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.813010 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc8bca0b-3590-4748-8dc9-d659f09631bd" Mar 08 05:31:33 crc kubenswrapper[4717]: E0308 05:31:33.813488 4717 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.814469 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:33 crc kubenswrapper[4717]: W0308 05:31:33.853514 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-bc7cc0dada71cc08bd74fe22df8a16a4c41e6f1fc6ea8ba2b6accebe8d4eb5b3 WatchSource:0}: Error finding container bc7cc0dada71cc08bd74fe22df8a16a4c41e6f1fc6ea8ba2b6accebe8d4eb5b3: Status 404 returned error can't find the container with id bc7cc0dada71cc08bd74fe22df8a16a4c41e6f1fc6ea8ba2b6accebe8d4eb5b3 Mar 08 05:31:33 crc kubenswrapper[4717]: I0308 05:31:33.928474 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bc7cc0dada71cc08bd74fe22df8a16a4c41e6f1fc6ea8ba2b6accebe8d4eb5b3"} Mar 08 05:31:34 crc kubenswrapper[4717]: E0308 05:31:34.209042 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection 
refused" interval="6.4s" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.942404 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.943860 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.944005 4717 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391" exitCode=1 Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.944138 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391"} Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.945196 4717 scope.go:117] "RemoveContainer" containerID="852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.945932 4717 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.946606 4717 status_manager.go:851] "Failed to get status for pod" podUID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.947109 4717 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="6a57529890aa671d0c9716372423ac940ccf5f56b260c8471862b3b7ebcd4763" exitCode=0 Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.947164 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"6a57529890aa671d0c9716372423ac940ccf5f56b260c8471862b3b7ebcd4763"} Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.947203 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.947629 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc8bca0b-3590-4748-8dc9-d659f09631bd" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.947666 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc8bca0b-3590-4748-8dc9-d659f09631bd" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.947977 4717 status_manager.go:851] "Failed to get status for pod" podUID="5f6f2c16-ce12-453a-a8cd-80cc875c17b2" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-79ff68f84c-kj8xm\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 
05:31:34 crc kubenswrapper[4717]: E0308 05:31:34.948239 4717 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.948455 4717 status_manager.go:851] "Failed to get status for pod" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7d6d57bb45-rxqz2\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.949007 4717 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.949386 4717 status_manager.go:851] "Failed to get status for pod" podUID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.949909 4717 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: 
connection refused" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.950778 4717 status_manager.go:851] "Failed to get status for pod" podUID="5f6f2c16-ce12-453a-a8cd-80cc875c17b2" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-79ff68f84c-kj8xm\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:34 crc kubenswrapper[4717]: I0308 05:31:34.952119 4717 status_manager.go:851] "Failed to get status for pod" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7d6d57bb45-rxqz2\": dial tcp 38.102.83.44:6443: connect: connection refused" Mar 08 05:31:35 crc kubenswrapper[4717]: E0308 05:31:35.026262 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ac6b415e7b288 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 05:31:20.087630472 +0000 UTC m=+307.005279316,LastTimestamp:2026-03-08 05:31:20.087630472 +0000 UTC 
m=+307.005279316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 05:31:35 crc kubenswrapper[4717]: I0308 05:31:35.969283 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f82c10c3bac4b8b6353fdc976126d4b8ee7f0c6539c159b0137883e767a7ffa4"} Mar 08 05:31:35 crc kubenswrapper[4717]: I0308 05:31:35.969748 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"689cf36918c750c9dc0d383b2b54a2ff257900bd8c8cf518e4a5f4fc61fca400"} Mar 08 05:31:35 crc kubenswrapper[4717]: I0308 05:31:35.969762 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e4d63196e464f29f79964ee52a3298822bf6872c8cfab7af52c8342c8a574687"} Mar 08 05:31:35 crc kubenswrapper[4717]: I0308 05:31:35.982791 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 08 05:31:35 crc kubenswrapper[4717]: I0308 05:31:35.984053 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 08 05:31:35 crc kubenswrapper[4717]: I0308 05:31:35.984160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28fee218042f2df9f376acbd408b9ee473809f802c4b6bd73eedfa49b59537c8"} Mar 08 05:31:36 crc kubenswrapper[4717]: I0308 
05:31:36.996295 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"17822cd740928afefe0bedc4cc0cb7294f7ec24dc521ea847f2a848b16cf438a"} Mar 08 05:31:36 crc kubenswrapper[4717]: I0308 05:31:36.996723 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:36 crc kubenswrapper[4717]: I0308 05:31:36.996735 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"73055aa1e21a660134944ba85a845a22909e5608b2b74bd5726b552987e81cb9"} Mar 08 05:31:36 crc kubenswrapper[4717]: I0308 05:31:36.996630 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc8bca0b-3590-4748-8dc9-d659f09631bd" Mar 08 05:31:36 crc kubenswrapper[4717]: I0308 05:31:36.996757 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc8bca0b-3590-4748-8dc9-d659f09631bd" Mar 08 05:31:38 crc kubenswrapper[4717]: I0308 05:31:38.419192 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:31:38 crc kubenswrapper[4717]: I0308 05:31:38.815623 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:38 crc kubenswrapper[4717]: I0308 05:31:38.815729 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:38 crc kubenswrapper[4717]: I0308 05:31:38.824001 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:39 crc kubenswrapper[4717]: I0308 
05:31:39.397264 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:31:39 crc kubenswrapper[4717]: I0308 05:31:39.397439 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 08 05:31:39 crc kubenswrapper[4717]: I0308 05:31:39.398797 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 08 05:31:39 crc kubenswrapper[4717]: I0308 05:31:39.678259 4717 patch_prober.go:28] interesting pod/route-controller-manager-7d6d57bb45-rxqz2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:31:39 crc kubenswrapper[4717]: I0308 05:31:39.678379 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:31:42 crc kubenswrapper[4717]: I0308 05:31:42.012432 4717 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:43 crc kubenswrapper[4717]: I0308 05:31:43.040663 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc8bca0b-3590-4748-8dc9-d659f09631bd" Mar 08 05:31:43 crc kubenswrapper[4717]: I0308 05:31:43.040760 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc8bca0b-3590-4748-8dc9-d659f09631bd" Mar 08 05:31:43 crc kubenswrapper[4717]: I0308 05:31:43.053104 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:43 crc kubenswrapper[4717]: I0308 05:31:43.812611 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="720cab69-3233-435a-963c-44436cd53d08" Mar 08 05:31:44 crc kubenswrapper[4717]: I0308 05:31:44.048924 4717 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc8bca0b-3590-4748-8dc9-d659f09631bd" Mar 08 05:31:44 crc kubenswrapper[4717]: I0308 05:31:44.048978 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc8bca0b-3590-4748-8dc9-d659f09631bd" Mar 08 05:31:44 crc kubenswrapper[4717]: I0308 05:31:44.054101 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="720cab69-3233-435a-963c-44436cd53d08" Mar 08 05:31:49 crc kubenswrapper[4717]: I0308 05:31:49.397408 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection 
refused" start-of-body= Mar 08 05:31:49 crc kubenswrapper[4717]: I0308 05:31:49.398042 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 08 05:31:49 crc kubenswrapper[4717]: I0308 05:31:49.677307 4717 patch_prober.go:28] interesting pod/route-controller-manager-7d6d57bb45-rxqz2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:31:49 crc kubenswrapper[4717]: I0308 05:31:49.677453 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:31:50 crc kubenswrapper[4717]: I0308 05:31:50.104863 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7d6d57bb45-rxqz2_89c4be73-2f2f-4703-b556-2a51a7962f81/route-controller-manager/0.log" Mar 08 05:31:50 crc kubenswrapper[4717]: I0308 05:31:50.105420 4717 generic.go:334] "Generic (PLEG): container finished" podID="89c4be73-2f2f-4703-b556-2a51a7962f81" containerID="4328749d9dd1beeb2ffd0423e79ccb902280a5dee5c3cbcc430be19518170196" exitCode=255 Mar 08 05:31:50 crc kubenswrapper[4717]: I0308 05:31:50.105475 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" event={"ID":"89c4be73-2f2f-4703-b556-2a51a7962f81","Type":"ContainerDied","Data":"4328749d9dd1beeb2ffd0423e79ccb902280a5dee5c3cbcc430be19518170196"} Mar 08 05:31:50 crc kubenswrapper[4717]: I0308 05:31:50.106381 4717 scope.go:117] "RemoveContainer" containerID="4328749d9dd1beeb2ffd0423e79ccb902280a5dee5c3cbcc430be19518170196" Mar 08 05:31:51 crc kubenswrapper[4717]: I0308 05:31:51.120233 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7d6d57bb45-rxqz2_89c4be73-2f2f-4703-b556-2a51a7962f81/route-controller-manager/0.log" Mar 08 05:31:51 crc kubenswrapper[4717]: I0308 05:31:51.120730 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" event={"ID":"89c4be73-2f2f-4703-b556-2a51a7962f81","Type":"ContainerStarted","Data":"a61dd7e6b60ac9787b159cb599f68ccbba42ff54f317fd9b6588df07f8e28512"} Mar 08 05:31:51 crc kubenswrapper[4717]: I0308 05:31:51.121575 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" Mar 08 05:31:52 crc kubenswrapper[4717]: I0308 05:31:52.121543 4717 patch_prober.go:28] interesting pod/route-controller-manager-7d6d57bb45-rxqz2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:31:52 crc kubenswrapper[4717]: I0308 05:31:52.121656 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" containerName="route-controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:31:52 crc kubenswrapper[4717]: I0308 05:31:52.160021 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 05:31:52 crc kubenswrapper[4717]: I0308 05:31:52.531390 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 05:31:52 crc kubenswrapper[4717]: I0308 05:31:52.654605 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 05:31:53 crc kubenswrapper[4717]: I0308 05:31:53.093449 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 08 05:31:53 crc kubenswrapper[4717]: I0308 05:31:53.129797 4717 patch_prober.go:28] interesting pod/route-controller-manager-7d6d57bb45-rxqz2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:31:53 crc kubenswrapper[4717]: I0308 05:31:53.129909 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:31:53 crc kubenswrapper[4717]: I0308 05:31:53.261491 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 
05:31:53 crc kubenswrapper[4717]: I0308 05:31:53.398490 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 05:31:53 crc kubenswrapper[4717]: I0308 05:31:53.908646 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 05:31:54 crc kubenswrapper[4717]: I0308 05:31:54.087612 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 05:31:54 crc kubenswrapper[4717]: I0308 05:31:54.147953 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 05:31:54 crc kubenswrapper[4717]: I0308 05:31:54.224045 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 05:31:54 crc kubenswrapper[4717]: I0308 05:31:54.349464 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 05:31:54 crc kubenswrapper[4717]: I0308 05:31:54.446891 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 05:31:54 crc kubenswrapper[4717]: I0308 05:31:54.475313 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 05:31:54 crc kubenswrapper[4717]: I0308 05:31:54.511445 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 05:31:54 crc kubenswrapper[4717]: I0308 05:31:54.745894 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 05:31:54 crc kubenswrapper[4717]: I0308 05:31:54.968148 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.053334 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.134168 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.134352 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.155768 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.262529 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.328953 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.385351 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.432420 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.490427 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.537919 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.619267 4717 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.619287 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.660174 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.740502 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.841535 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.876468 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.913801 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 05:31:55 crc kubenswrapper[4717]: I0308 05:31:55.944210 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.028035 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.036772 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.101526 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 08 05:31:56 crc 
kubenswrapper[4717]: I0308 05:31:56.131717 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.259820 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.269753 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.297639 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.340561 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.405273 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.412144 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.530779 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.550596 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.586426 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.617790 4717 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.626393 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.632433 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 05:31:56 crc kubenswrapper[4717]: I0308 05:31:56.648779 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.119782 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.144566 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.228546 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.273056 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.318635 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.378044 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.393204 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 
05:31:57.441466 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.576634 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.606436 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.641488 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.649142 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.659979 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.698837 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.799633 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.804840 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.907213 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 05:31:57 crc kubenswrapper[4717]: I0308 05:31:57.963475 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 
08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.028538 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.105580 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.121911 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.145304 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.164112 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.248798 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.283910 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.306202 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.345218 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.381077 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.436747 4717 reflector.go:368] Caches populated for *v1.Pod from 
pkg/kubelet/config/apiserver.go:66 Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.438178 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" podStartSLOduration=41.438141682 podStartE2EDuration="41.438141682s" podCreationTimestamp="2026-03-08 05:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:31:42.076036136 +0000 UTC m=+328.993684990" watchObservedRunningTime="2026-03-08 05:31:58.438141682 +0000 UTC m=+345.355790566" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.442299 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.44228619 podStartE2EDuration="39.44228619s" podCreationTimestamp="2026-03-08 05:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:31:42.039787184 +0000 UTC m=+328.957436058" watchObservedRunningTime="2026-03-08 05:31:58.44228619 +0000 UTC m=+345.359935064" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.444609 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79ff68f84c-kj8xm" podStartSLOduration=41.444594911 podStartE2EDuration="41.444594911s" podCreationTimestamp="2026-03-08 05:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:31:42.059465701 +0000 UTC m=+328.977114555" watchObservedRunningTime="2026-03-08 05:31:58.444594911 +0000 UTC m=+345.362243795" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.445894 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 
08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.445971 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.453865 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.481641 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.481617533 podStartE2EDuration="16.481617533s" podCreationTimestamp="2026-03-08 05:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:31:58.474794384 +0000 UTC m=+345.392443258" watchObservedRunningTime="2026-03-08 05:31:58.481617533 +0000 UTC m=+345.399266417" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.495995 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.501363 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.527498 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.532559 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.546415 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.572321 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.710293 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.759972 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.778963 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.938724 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 08 05:31:58 crc kubenswrapper[4717]: I0308 05:31:58.977890 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.021963 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.055922 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.182507 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.210033 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.304680 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.323995 
4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.389327 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.396847 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.396946 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.397524 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.399409 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"28fee218042f2df9f376acbd408b9ee473809f802c4b6bd73eedfa49b59537c8"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.399675 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="kube-controller-manager" containerID="cri-o://28fee218042f2df9f376acbd408b9ee473809f802c4b6bd73eedfa49b59537c8" gracePeriod=30 Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.529386 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.624893 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.677836 4717 patch_prober.go:28] interesting pod/route-controller-manager-7d6d57bb45-rxqz2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.677956 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2" podUID="89c4be73-2f2f-4703-b556-2a51a7962f81" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.690462 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.712814 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.786488 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 
05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.793935 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.842544 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.969384 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 08 05:31:59 crc kubenswrapper[4717]: I0308 05:31:59.998260 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.252078 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.276175 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.331519 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.342019 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.380345 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.454248 4717 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.489250 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.491186 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.546502 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.604348 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.671353 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.708459 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.853279 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.854375 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.920844 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.957494 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 05:32:00 crc kubenswrapper[4717]: I0308 05:32:00.983394 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.035896 4717 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.150747 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.233357 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.325848 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.416309 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.470029 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.569318 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.732133 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.816331 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.864648 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.873936 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 08 05:32:01 crc 
kubenswrapper[4717]: I0308 05:32:01.917518 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 05:32:01 crc kubenswrapper[4717]: I0308 05:32:01.984220 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.030526 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.121600 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.151798 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.205641 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.261035 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.276937 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.307406 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.352816 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.392653 4717 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.441228 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.455619 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.461173 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.491022 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.503168 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.511377 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.524917 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.585000 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.603576 4717 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.603844 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 
05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.637998 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.640719 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.714678 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.770498 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.798315 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.858714 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.863393 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.864841 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.891251 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 08 05:32:02 crc kubenswrapper[4717]: I0308 05:32:02.956831 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.027709 4717 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.039998 4717 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.052109 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.069506 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.076595 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.096852 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.149119 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.247169 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.370185 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.432538 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.438169 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.489595 4717 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.532711 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.552986 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.808435 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.842204 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.892379 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.902495 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.912768 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.940827 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 05:32:03 crc kubenswrapper[4717]: I0308 05:32:03.950001 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.033675 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 05:32:04 
crc kubenswrapper[4717]: I0308 05:32:04.088455 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.142609 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.205889 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.322716 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.358025 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.392741 4717 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.418368 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.481312 4717 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.598499 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.631464 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.645529 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.659539 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.739610 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.755039 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.758465 4717 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.758930 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1376032517a22727c50ed3c89724b142db50e07138392c534e6b54f4b079198b" gracePeriod=5
Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.766665 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 08 05:32:04 crc kubenswrapper[4717]: I0308 05:32:04.953734 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.052823 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.187832 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.206248 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.234743 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.269252 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.277635 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.314548 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.429963 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.524631 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.597936 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.647906 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.710811 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.799959 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.815051 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.815178 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.836432 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.837140 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.880167 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.903607 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.969831 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.975598 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 08 05:32:05 crc kubenswrapper[4717]: I0308 05:32:05.984260 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.017600 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.032566 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.104722 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.178268 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.188099 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.259076 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.273023 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.307624 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.381298 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.666135 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.877940 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 08 05:32:06 crc kubenswrapper[4717]: I0308 05:32:06.991030 4717 reflector.go:368]
Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 08 05:32:07 crc kubenswrapper[4717]: I0308 05:32:07.029014 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 08 05:32:07 crc kubenswrapper[4717]: I0308 05:32:07.058165 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 08 05:32:07 crc kubenswrapper[4717]: I0308 05:32:07.329931 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 08 05:32:07 crc kubenswrapper[4717]: I0308 05:32:07.365110 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 08 05:32:07 crc kubenswrapper[4717]: I0308 05:32:07.404884 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 08 05:32:07 crc kubenswrapper[4717]: I0308 05:32:07.427230 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 08 05:32:07 crc kubenswrapper[4717]: I0308 05:32:07.494101 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 05:32:07 crc kubenswrapper[4717]: I0308 05:32:07.830591 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 08 05:32:08 crc kubenswrapper[4717]: I0308 05:32:08.193580 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 08 05:32:08 crc kubenswrapper[4717]: I0308 05:32:08.230239 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 08 05:32:08 crc kubenswrapper[4717]: I0308 05:32:08.275604 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 08 05:32:08 crc kubenswrapper[4717]: I0308 05:32:08.631674 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 08 05:32:08 crc kubenswrapper[4717]: I0308 05:32:08.684565 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d6d57bb45-rxqz2"
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.291144 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.291741 4717 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1376032517a22727c50ed3c89724b142db50e07138392c534e6b54f4b079198b" exitCode=137
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.381945 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.382083 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.524714 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.524838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.524906 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.524913 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.524965 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.525020 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.525012 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.525192 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.525185 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.525909 4717 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.525948 4717 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.525973 4717 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.525992 4717 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.540744 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 05:32:10 crc kubenswrapper[4717]: I0308 05:32:10.626855 4717 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 08 05:32:11 crc kubenswrapper[4717]: I0308 05:32:11.306840 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 08 05:32:11 crc kubenswrapper[4717]: I0308 05:32:11.307529 4717 scope.go:117] "RemoveContainer" containerID="1376032517a22727c50ed3c89724b142db50e07138392c534e6b54f4b079198b"
Mar 08 05:32:11 crc kubenswrapper[4717]: I0308 05:32:11.307836 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 05:32:11 crc kubenswrapper[4717]: I0308 05:32:11.796326 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 08 05:32:11 crc kubenswrapper[4717]: I0308 05:32:11.797496 4717 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Mar 08 05:32:11 crc kubenswrapper[4717]: I0308 05:32:11.818988 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 08 05:32:11 crc kubenswrapper[4717]: I0308 05:32:11.819341 4717 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="776e76f1-2791-429b-9a9f-1a22e3cba9da"
Mar 08 05:32:11 crc kubenswrapper[4717]: I0308 05:32:11.826238 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 08 05:32:11 crc kubenswrapper[4717]: I0308 05:32:11.826328 4717 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="776e76f1-2791-429b-9a9f-1a22e3cba9da"
Mar 08 05:32:28 crc kubenswrapper[4717]: I0308 05:32:28.441454 4717 generic.go:334] "Generic (PLEG): container finished" podID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerID="e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9" exitCode=0
Mar 08 05:32:28 crc kubenswrapper[4717]: I0308 05:32:28.441623 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" event={"ID":"22aa82e5-83a2-4046-8d11-89e9d34e00e1","Type":"ContainerDied","Data":"e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9"}
Mar 08 05:32:28 crc kubenswrapper[4717]: I0308 05:32:28.443204 4717 scope.go:117] "RemoveContainer" containerID="e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9"
Mar 08 05:32:29 crc kubenswrapper[4717]: I0308 05:32:29.456051 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" event={"ID":"22aa82e5-83a2-4046-8d11-89e9d34e00e1","Type":"ContainerStarted","Data":"6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb"}
Mar 08 05:32:29 crc kubenswrapper[4717]: I0308 05:32:29.457057 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq"
Mar 08 05:32:29 crc kubenswrapper[4717]: I0308 05:32:29.461236 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq"
Mar 08 05:32:30 crc kubenswrapper[4717]: I0308 05:32:30.474373 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 08 05:32:30 crc kubenswrapper[4717]: I0308 05:32:30.479344 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 08 05:32:30 crc kubenswrapper[4717]: I0308 05:32:30.480438 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 08 05:32:30 crc kubenswrapper[4717]: I0308 05:32:30.480518 4717 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="28fee218042f2df9f376acbd408b9ee473809f802c4b6bd73eedfa49b59537c8" exitCode=137
Mar 08 05:32:30 crc kubenswrapper[4717]: I0308 05:32:30.480900 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"28fee218042f2df9f376acbd408b9ee473809f802c4b6bd73eedfa49b59537c8"}
Mar 08 05:32:30 crc kubenswrapper[4717]: I0308 05:32:30.481178 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9146242987b8f89fbe8d6b3e22257d3b701a249951561af7961c95b78569fd2d"}
Mar 08 05:32:30 crc kubenswrapper[4717]: I0308 05:32:30.481464 4717 scope.go:117] "RemoveContainer" containerID="852854b168e3cb21d0e02de92cf04551f8de154f4cd578aee64a8951a9dd2391"
Mar 08 05:32:31 crc kubenswrapper[4717]: I0308 05:32:31.494977 4717 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 08 05:32:31 crc kubenswrapper[4717]: I0308 05:32:31.500035 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 08 05:32:38 crc kubenswrapper[4717]: I0308 05:32:38.418855 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 08 05:32:39 crc kubenswrapper[4717]: I0308 05:32:39.396867 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 08 05:32:39 crc kubenswrapper[4717]: I0308 05:32:39.407565 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 08 05:32:42 crc kubenswrapper[4717]: I0308 05:32:42.881013 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 08 05:32:43 crc kubenswrapper[4717]: I0308 05:32:43.427415 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 08 05:32:48 crc kubenswrapper[4717]: I0308 05:32:48.426140 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.686522 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549132-ffgrj"]
Mar 08 05:32:50 crc kubenswrapper[4717]: E0308 05:32:50.687629 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" containerName="installer"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.687729 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" containerName="installer"
Mar 08 05:32:50 crc kubenswrapper[4717]: E0308 05:32:50.687805 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.687869 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.688021 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.688096 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b0fb22-162d-4b6b-a7f5-3fbeb4c71c84" containerName="installer"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.688577 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549132-ffgrj"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.697518 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.700023 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.700596 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.716195 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549132-ffgrj"]
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.791876 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmqw8\" (UniqueName: \"kubernetes.io/projected/14b6ad0f-a029-40ca-9e23-a1794b1fb3a8-kube-api-access-vmqw8\") pod \"auto-csr-approver-29549132-ffgrj\" (UID: \"14b6ad0f-a029-40ca-9e23-a1794b1fb3a8\") " pod="openshift-infra/auto-csr-approver-29549132-ffgrj"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.893905 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmqw8\" (UniqueName: \"kubernetes.io/projected/14b6ad0f-a029-40ca-9e23-a1794b1fb3a8-kube-api-access-vmqw8\") pod \"auto-csr-approver-29549132-ffgrj\" (UID: \"14b6ad0f-a029-40ca-9e23-a1794b1fb3a8\") " pod="openshift-infra/auto-csr-approver-29549132-ffgrj"
Mar 08 05:32:50 crc kubenswrapper[4717]: I0308 05:32:50.923572 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmqw8\" (UniqueName: \"kubernetes.io/projected/14b6ad0f-a029-40ca-9e23-a1794b1fb3a8-kube-api-access-vmqw8\") pod \"auto-csr-approver-29549132-ffgrj\" (UID: \"14b6ad0f-a029-40ca-9e23-a1794b1fb3a8\") " pod="openshift-infra/auto-csr-approver-29549132-ffgrj"
Mar 08 05:32:51 crc kubenswrapper[4717]: I0308 05:32:51.005037 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549132-ffgrj"
Mar 08 05:32:51 crc kubenswrapper[4717]: I0308 05:32:51.430748 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549132-ffgrj"]
Mar 08 05:32:51 crc kubenswrapper[4717]: I0308 05:32:51.652830 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549132-ffgrj" event={"ID":"14b6ad0f-a029-40ca-9e23-a1794b1fb3a8","Type":"ContainerStarted","Data":"bba743bf5c4c2c3155e59421bdcbe66e782cef1abd84cba6e43101a41d5f201e"}
Mar 08 05:32:53 crc kubenswrapper[4717]: I0308 05:32:53.668594 4717 generic.go:334] "Generic (PLEG): container finished" podID="14b6ad0f-a029-40ca-9e23-a1794b1fb3a8" containerID="d4253d896c970eacbaf4aa5f9f8157db8345f281a092d32e4503e112958f6b0f" exitCode=0
Mar 08 05:32:53 crc kubenswrapper[4717]: I0308 05:32:53.668718 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549132-ffgrj" event={"ID":"14b6ad0f-a029-40ca-9e23-a1794b1fb3a8","Type":"ContainerDied","Data":"d4253d896c970eacbaf4aa5f9f8157db8345f281a092d32e4503e112958f6b0f"}
Mar 08 05:32:55 crc kubenswrapper[4717]: I0308 05:32:55.013962 4717 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549132-ffgrj"
Mar 08 05:32:55 crc kubenswrapper[4717]: I0308 05:32:55.063479 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmqw8\" (UniqueName: \"kubernetes.io/projected/14b6ad0f-a029-40ca-9e23-a1794b1fb3a8-kube-api-access-vmqw8\") pod \"14b6ad0f-a029-40ca-9e23-a1794b1fb3a8\" (UID: \"14b6ad0f-a029-40ca-9e23-a1794b1fb3a8\") "
Mar 08 05:32:55 crc kubenswrapper[4717]: I0308 05:32:55.074674 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b6ad0f-a029-40ca-9e23-a1794b1fb3a8-kube-api-access-vmqw8" (OuterVolumeSpecName: "kube-api-access-vmqw8") pod "14b6ad0f-a029-40ca-9e23-a1794b1fb3a8" (UID: "14b6ad0f-a029-40ca-9e23-a1794b1fb3a8"). InnerVolumeSpecName "kube-api-access-vmqw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:32:55 crc kubenswrapper[4717]: I0308 05:32:55.166656 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmqw8\" (UniqueName: \"kubernetes.io/projected/14b6ad0f-a029-40ca-9e23-a1794b1fb3a8-kube-api-access-vmqw8\") on node \"crc\" DevicePath \"\""
Mar 08 05:32:55 crc kubenswrapper[4717]: I0308 05:32:55.696889 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549132-ffgrj" event={"ID":"14b6ad0f-a029-40ca-9e23-a1794b1fb3a8","Type":"ContainerDied","Data":"bba743bf5c4c2c3155e59421bdcbe66e782cef1abd84cba6e43101a41d5f201e"}
Mar 08 05:32:55 crc kubenswrapper[4717]: I0308 05:32:55.697555 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba743bf5c4c2c3155e59421bdcbe66e782cef1abd84cba6e43101a41d5f201e"
Mar 08 05:32:55 crc kubenswrapper[4717]: I0308 05:32:55.696978 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549132-ffgrj"
Mar 08 05:33:15 crc kubenswrapper[4717]: I0308 05:33:15.943394 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-89jks"]
Mar 08 05:33:15 crc kubenswrapper[4717]: E0308 05:33:15.944508 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b6ad0f-a029-40ca-9e23-a1794b1fb3a8" containerName="oc"
Mar 08 05:33:15 crc kubenswrapper[4717]: I0308 05:33:15.944526 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b6ad0f-a029-40ca-9e23-a1794b1fb3a8" containerName="oc"
Mar 08 05:33:15 crc kubenswrapper[4717]: I0308 05:33:15.944677 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b6ad0f-a029-40ca-9e23-a1794b1fb3a8" containerName="oc"
Mar 08 05:33:15 crc kubenswrapper[4717]: I0308 05:33:15.945277 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:15 crc kubenswrapper[4717]: I0308 05:33:15.960487 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-89jks"]
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.125044 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-bound-sa-token\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.125145 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-registry-tls\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.125187 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.125232 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-registry-certificates\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.125510 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.125623 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w66t\" (UniqueName: \"kubernetes.io/projected/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-kube-api-access-5w66t\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.125815 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.125926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-trusted-ca\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.162721 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.227494 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-trusted-ca\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.227655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-bound-sa-token\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.227735 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.227769 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-registry-tls\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.227812 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-registry-certificates\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.227871 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.227908 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w66t\" (UniqueName: \"kubernetes.io/projected/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-kube-api-access-5w66t\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.228629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.230256 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-trusted-ca\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.230515 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-registry-certificates\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.243122 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks"
Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.243173 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-registry-tls\") pod
\"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks" Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.252023 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-bound-sa-token\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks" Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.261266 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w66t\" (UniqueName: \"kubernetes.io/projected/0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1-kube-api-access-5w66t\") pod \"image-registry-66df7c8f76-89jks\" (UID: \"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1\") " pod="openshift-image-registry/image-registry-66df7c8f76-89jks" Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.264045 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-89jks" Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.556598 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-89jks"] Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.876783 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-89jks" event={"ID":"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1","Type":"ContainerStarted","Data":"f2b2551e88dfb1450a74cd78d6d8286a749923a5c8479b871827961d6ad3472f"} Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.877200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-89jks" event={"ID":"0956b7e7-ed2c-4e5f-9002-a3ed42bfd6d1","Type":"ContainerStarted","Data":"0ed21ab298f1762c035c073b8bd4a2ffe65935db8027f0ba05985a5f7690484a"} Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.877224 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-89jks" Mar 08 05:33:16 crc kubenswrapper[4717]: I0308 05:33:16.899573 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-89jks" podStartSLOduration=1.899536353 podStartE2EDuration="1.899536353s" podCreationTimestamp="2026-03-08 05:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:33:16.89586873 +0000 UTC m=+423.813517594" watchObservedRunningTime="2026-03-08 05:33:16.899536353 +0000 UTC m=+423.817185237" Mar 08 05:33:34 crc kubenswrapper[4717]: I0308 05:33:34.120197 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:33:34 crc kubenswrapper[4717]: I0308 05:33:34.121323 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:33:36 crc kubenswrapper[4717]: I0308 05:33:36.276266 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-89jks" Mar 08 05:33:36 crc kubenswrapper[4717]: I0308 05:33:36.355058 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmg8f"] Mar 08 05:33:40 crc kubenswrapper[4717]: I0308 05:33:40.933723 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lmkvf"] Mar 08 05:33:40 crc kubenswrapper[4717]: I0308 05:33:40.938980 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgbcl"] Mar 08 05:33:40 crc kubenswrapper[4717]: I0308 05:33:40.938974 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lmkvf" podUID="5961d211-7900-41ef-9915-d935e9cec42a" containerName="registry-server" containerID="cri-o://b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf" gracePeriod=30 Mar 08 05:33:40 crc kubenswrapper[4717]: I0308 05:33:40.939290 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hgbcl" podUID="d612266d-387c-4561-a50f-02cd3cced887" containerName="registry-server" containerID="cri-o://d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d" gracePeriod=30 Mar 08 05:33:40 crc 
kubenswrapper[4717]: I0308 05:33:40.967972 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhjjq"] Mar 08 05:33:40 crc kubenswrapper[4717]: I0308 05:33:40.968271 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" containerID="cri-o://6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb" gracePeriod=30 Mar 08 05:33:40 crc kubenswrapper[4717]: I0308 05:33:40.985580 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7t8t"] Mar 08 05:33:40 crc kubenswrapper[4717]: I0308 05:33:40.986007 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x7t8t" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerName="registry-server" containerID="cri-o://cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8" gracePeriod=30 Mar 08 05:33:40 crc kubenswrapper[4717]: I0308 05:33:40.998033 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n69hm"] Mar 08 05:33:40 crc kubenswrapper[4717]: I0308 05:33:40.998379 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n69hm" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerName="registry-server" containerID="cri-o://20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e" gracePeriod=30 Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.013838 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zhjjq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": read tcp 10.217.0.2:47526->10.217.0.33:8080: read: connection reset by peer" 
start-of-body= Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.013956 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": read tcp 10.217.0.2:47526->10.217.0.33:8080: read: connection reset by peer" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.018043 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cmqxx"] Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.019063 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.025625 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cmqxx"] Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.113101 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61147cf3-b98d-4c9f-a053-2d818468c5e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cmqxx\" (UID: \"61147cf3-b98d-4c9f-a053-2d818468c5e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.113231 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61147cf3-b98d-4c9f-a053-2d818468c5e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cmqxx\" (UID: \"61147cf3-b98d-4c9f-a053-2d818468c5e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.113262 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5rcv\" (UniqueName: \"kubernetes.io/projected/61147cf3-b98d-4c9f-a053-2d818468c5e0-kube-api-access-k5rcv\") pod \"marketplace-operator-79b997595-cmqxx\" (UID: \"61147cf3-b98d-4c9f-a053-2d818468c5e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.214895 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61147cf3-b98d-4c9f-a053-2d818468c5e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cmqxx\" (UID: \"61147cf3-b98d-4c9f-a053-2d818468c5e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.215480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61147cf3-b98d-4c9f-a053-2d818468c5e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cmqxx\" (UID: \"61147cf3-b98d-4c9f-a053-2d818468c5e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.215507 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5rcv\" (UniqueName: \"kubernetes.io/projected/61147cf3-b98d-4c9f-a053-2d818468c5e0-kube-api-access-k5rcv\") pod \"marketplace-operator-79b997595-cmqxx\" (UID: \"61147cf3-b98d-4c9f-a053-2d818468c5e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.216312 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61147cf3-b98d-4c9f-a053-2d818468c5e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cmqxx\" 
(UID: \"61147cf3-b98d-4c9f-a053-2d818468c5e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.232498 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5rcv\" (UniqueName: \"kubernetes.io/projected/61147cf3-b98d-4c9f-a053-2d818468c5e0-kube-api-access-k5rcv\") pod \"marketplace-operator-79b997595-cmqxx\" (UID: \"61147cf3-b98d-4c9f-a053-2d818468c5e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.234957 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61147cf3-b98d-4c9f-a053-2d818468c5e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cmqxx\" (UID: \"61147cf3-b98d-4c9f-a053-2d818468c5e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.388416 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.393515 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.438814 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.441375 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.447140 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.509342 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.524435 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-utilities\") pod \"d612266d-387c-4561-a50f-02cd3cced887\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.524791 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-catalog-content\") pod \"d612266d-387c-4561-a50f-02cd3cced887\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.524949 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg4gn\" (UniqueName: \"kubernetes.io/projected/d612266d-387c-4561-a50f-02cd3cced887-kube-api-access-qg4gn\") pod \"d612266d-387c-4561-a50f-02cd3cced887\" (UID: \"d612266d-387c-4561-a50f-02cd3cced887\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.527710 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-utilities" (OuterVolumeSpecName: "utilities") pod "d612266d-387c-4561-a50f-02cd3cced887" (UID: "d612266d-387c-4561-a50f-02cd3cced887"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.542893 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d612266d-387c-4561-a50f-02cd3cced887-kube-api-access-qg4gn" (OuterVolumeSpecName: "kube-api-access-qg4gn") pod "d612266d-387c-4561-a50f-02cd3cced887" (UID: "d612266d-387c-4561-a50f-02cd3cced887"). InnerVolumeSpecName "kube-api-access-qg4gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.589842 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d612266d-387c-4561-a50f-02cd3cced887" (UID: "d612266d-387c-4561-a50f-02cd3cced887"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626080 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-utilities\") pod \"06f4ab9f-48eb-410c-8915-c47c5cff1650\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626156 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-catalog-content\") pod \"2ce686db-32d9-41b7-80fa-124e094dc4e8\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626214 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-trusted-ca\") pod \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\" (UID: 
\"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626268 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-utilities\") pod \"2ce686db-32d9-41b7-80fa-124e094dc4e8\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626723 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-catalog-content\") pod \"06f4ab9f-48eb-410c-8915-c47c5cff1650\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626798 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-catalog-content\") pod \"5961d211-7900-41ef-9915-d935e9cec42a\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626825 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl2w7\" (UniqueName: \"kubernetes.io/projected/2ce686db-32d9-41b7-80fa-124e094dc4e8-kube-api-access-hl2w7\") pod \"2ce686db-32d9-41b7-80fa-124e094dc4e8\" (UID: \"2ce686db-32d9-41b7-80fa-124e094dc4e8\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626858 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-operator-metrics\") pod \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626885 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-llcwp\" (UniqueName: \"kubernetes.io/projected/06f4ab9f-48eb-410c-8915-c47c5cff1650-kube-api-access-llcwp\") pod \"06f4ab9f-48eb-410c-8915-c47c5cff1650\" (UID: \"06f4ab9f-48eb-410c-8915-c47c5cff1650\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626874 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-utilities" (OuterVolumeSpecName: "utilities") pod "06f4ab9f-48eb-410c-8915-c47c5cff1650" (UID: "06f4ab9f-48eb-410c-8915-c47c5cff1650"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.626914 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhqmq\" (UniqueName: \"kubernetes.io/projected/5961d211-7900-41ef-9915-d935e9cec42a-kube-api-access-jhqmq\") pod \"5961d211-7900-41ef-9915-d935e9cec42a\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.627073 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4qg5\" (UniqueName: \"kubernetes.io/projected/22aa82e5-83a2-4046-8d11-89e9d34e00e1-kube-api-access-q4qg5\") pod \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\" (UID: \"22aa82e5-83a2-4046-8d11-89e9d34e00e1\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.627112 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-utilities\") pod \"5961d211-7900-41ef-9915-d935e9cec42a\" (UID: \"5961d211-7900-41ef-9915-d935e9cec42a\") " Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.629587 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-utilities" (OuterVolumeSpecName: "utilities") pod 
"5961d211-7900-41ef-9915-d935e9cec42a" (UID: "5961d211-7900-41ef-9915-d935e9cec42a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.629654 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg4gn\" (UniqueName: \"kubernetes.io/projected/d612266d-387c-4561-a50f-02cd3cced887-kube-api-access-qg4gn\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.629693 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.629710 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d612266d-387c-4561-a50f-02cd3cced887-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.629723 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.629767 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "22aa82e5-83a2-4046-8d11-89e9d34e00e1" (UID: "22aa82e5-83a2-4046-8d11-89e9d34e00e1"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.630893 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-utilities" (OuterVolumeSpecName: "utilities") pod "2ce686db-32d9-41b7-80fa-124e094dc4e8" (UID: "2ce686db-32d9-41b7-80fa-124e094dc4e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.631538 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce686db-32d9-41b7-80fa-124e094dc4e8-kube-api-access-hl2w7" (OuterVolumeSpecName: "kube-api-access-hl2w7") pod "2ce686db-32d9-41b7-80fa-124e094dc4e8" (UID: "2ce686db-32d9-41b7-80fa-124e094dc4e8"). InnerVolumeSpecName "kube-api-access-hl2w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.631653 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "22aa82e5-83a2-4046-8d11-89e9d34e00e1" (UID: "22aa82e5-83a2-4046-8d11-89e9d34e00e1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.631932 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f4ab9f-48eb-410c-8915-c47c5cff1650-kube-api-access-llcwp" (OuterVolumeSpecName: "kube-api-access-llcwp") pod "06f4ab9f-48eb-410c-8915-c47c5cff1650" (UID: "06f4ab9f-48eb-410c-8915-c47c5cff1650"). InnerVolumeSpecName "kube-api-access-llcwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.632805 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22aa82e5-83a2-4046-8d11-89e9d34e00e1-kube-api-access-q4qg5" (OuterVolumeSpecName: "kube-api-access-q4qg5") pod "22aa82e5-83a2-4046-8d11-89e9d34e00e1" (UID: "22aa82e5-83a2-4046-8d11-89e9d34e00e1"). InnerVolumeSpecName "kube-api-access-q4qg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.639959 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5961d211-7900-41ef-9915-d935e9cec42a-kube-api-access-jhqmq" (OuterVolumeSpecName: "kube-api-access-jhqmq") pod "5961d211-7900-41ef-9915-d935e9cec42a" (UID: "5961d211-7900-41ef-9915-d935e9cec42a"). InnerVolumeSpecName "kube-api-access-jhqmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.660838 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ce686db-32d9-41b7-80fa-124e094dc4e8" (UID: "2ce686db-32d9-41b7-80fa-124e094dc4e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.730803 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl2w7\" (UniqueName: \"kubernetes.io/projected/2ce686db-32d9-41b7-80fa-124e094dc4e8-kube-api-access-hl2w7\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.730834 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.730846 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llcwp\" (UniqueName: \"kubernetes.io/projected/06f4ab9f-48eb-410c-8915-c47c5cff1650-kube-api-access-llcwp\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.730858 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhqmq\" (UniqueName: \"kubernetes.io/projected/5961d211-7900-41ef-9915-d935e9cec42a-kube-api-access-jhqmq\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.730868 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4qg5\" (UniqueName: \"kubernetes.io/projected/22aa82e5-83a2-4046-8d11-89e9d34e00e1-kube-api-access-q4qg5\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.730879 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.730893 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-catalog-content\") on node \"crc\" DevicePath 
\"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.730904 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22aa82e5-83a2-4046-8d11-89e9d34e00e1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.730914 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce686db-32d9-41b7-80fa-124e094dc4e8-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.737404 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5961d211-7900-41ef-9915-d935e9cec42a" (UID: "5961d211-7900-41ef-9915-d935e9cec42a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.772936 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06f4ab9f-48eb-410c-8915-c47c5cff1650" (UID: "06f4ab9f-48eb-410c-8915-c47c5cff1650"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.832831 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06f4ab9f-48eb-410c-8915-c47c5cff1650-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.832915 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5961d211-7900-41ef-9915-d935e9cec42a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:33:41 crc kubenswrapper[4717]: I0308 05:33:41.869112 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cmqxx"] Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.077810 4717 generic.go:334] "Generic (PLEG): container finished" podID="d612266d-387c-4561-a50f-02cd3cced887" containerID="d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d" exitCode=0 Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.077877 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hgbcl" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.077882 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgbcl" event={"ID":"d612266d-387c-4561-a50f-02cd3cced887","Type":"ContainerDied","Data":"d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.077950 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hgbcl" event={"ID":"d612266d-387c-4561-a50f-02cd3cced887","Type":"ContainerDied","Data":"4ed1ed426811b36bacde11eef678ee3e15e605e3a7160e65441ee43448aaa5f8"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.077970 4717 scope.go:117] "RemoveContainer" containerID="d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.082429 4717 generic.go:334] "Generic (PLEG): container finished" podID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerID="20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e" exitCode=0 Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.082513 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n69hm" event={"ID":"06f4ab9f-48eb-410c-8915-c47c5cff1650","Type":"ContainerDied","Data":"20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.082557 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n69hm" event={"ID":"06f4ab9f-48eb-410c-8915-c47c5cff1650","Type":"ContainerDied","Data":"642ca03781ed5fa615f4e743f37ba682ae4a27dad337b1ad959d55a98107dfb3"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.082929 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n69hm" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.084779 4717 generic.go:334] "Generic (PLEG): container finished" podID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerID="6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb" exitCode=0 Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.084822 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" event={"ID":"22aa82e5-83a2-4046-8d11-89e9d34e00e1","Type":"ContainerDied","Data":"6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.084842 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" event={"ID":"22aa82e5-83a2-4046-8d11-89e9d34e00e1","Type":"ContainerDied","Data":"25c89bced4be891da572e03247861075e21bd75153e8f4ee70825b5f5930cefb"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.084908 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zhjjq" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.086860 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" event={"ID":"61147cf3-b98d-4c9f-a053-2d818468c5e0","Type":"ContainerStarted","Data":"85f820670ab093d0e1e5176e967c07eab481ac2b0a289e915ecb2ceb9d2fdcc1"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.086887 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" event={"ID":"61147cf3-b98d-4c9f-a053-2d818468c5e0","Type":"ContainerStarted","Data":"e52c2e4a340d9b59ed07776a1fcb71371083d51ab356b9f45c2f758ab4f1f4bc"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.087418 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.090572 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cmqxx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.73:8080/healthz\": dial tcp 10.217.0.73:8080: connect: connection refused" start-of-body= Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.090649 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" podUID="61147cf3-b98d-4c9f-a053-2d818468c5e0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.73:8080/healthz\": dial tcp 10.217.0.73:8080: connect: connection refused" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.090888 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7t8t" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.090804 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7t8t" event={"ID":"2ce686db-32d9-41b7-80fa-124e094dc4e8","Type":"ContainerDied","Data":"cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.090779 4717 generic.go:334] "Generic (PLEG): container finished" podID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerID="cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8" exitCode=0 Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.091206 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7t8t" event={"ID":"2ce686db-32d9-41b7-80fa-124e094dc4e8","Type":"ContainerDied","Data":"1886c95cd0a6853721a991d4ebfb850adc71bfa0062bd6590a7da99292c838f2"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.098744 4717 generic.go:334] "Generic (PLEG): container finished" podID="5961d211-7900-41ef-9915-d935e9cec42a" containerID="b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf" exitCode=0 Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.098785 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmkvf" event={"ID":"5961d211-7900-41ef-9915-d935e9cec42a","Type":"ContainerDied","Data":"b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.098866 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lmkvf" event={"ID":"5961d211-7900-41ef-9915-d935e9cec42a","Type":"ContainerDied","Data":"83bc79adce75aeb898d839d8b796b43450c206a879b1d43cd6187f05ab1e004c"} Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.098970 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lmkvf" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.133379 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" podStartSLOduration=2.133349858 podStartE2EDuration="2.133349858s" podCreationTimestamp="2026-03-08 05:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:33:42.116818616 +0000 UTC m=+449.034467460" watchObservedRunningTime="2026-03-08 05:33:42.133349858 +0000 UTC m=+449.050998712" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.148119 4717 scope.go:117] "RemoveContainer" containerID="df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.164077 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n69hm"] Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.167618 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n69hm"] Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.178023 4717 scope.go:117] "RemoveContainer" containerID="c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.187967 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lmkvf"] Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.197531 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lmkvf"] Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.202963 4717 scope.go:117] "RemoveContainer" containerID="d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.203748 4717 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d\": container with ID starting with d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d not found: ID does not exist" containerID="d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.203815 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d"} err="failed to get container status \"d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d\": rpc error: code = NotFound desc = could not find container \"d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d\": container with ID starting with d5f5d2a446b0b6c36936287b72d07db3bd8b96fab9e7898bcaf59c28ac66ee1d not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.203855 4717 scope.go:117] "RemoveContainer" containerID="df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.204356 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71\": container with ID starting with df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71 not found: ID does not exist" containerID="df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.204393 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71"} err="failed to get container status \"df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71\": rpc error: code = NotFound desc = could not find container 
\"df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71\": container with ID starting with df9e4efeb6a20103561b26fe4b157d67e07b8df6f09f36563246554cd79d0e71 not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.204414 4717 scope.go:117] "RemoveContainer" containerID="c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.204502 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhjjq"] Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.205025 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e\": container with ID starting with c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e not found: ID does not exist" containerID="c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.205069 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e"} err="failed to get container status \"c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e\": rpc error: code = NotFound desc = could not find container \"c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e\": container with ID starting with c8b7f061c148a7490d3d3e115bc384845ee8fe938b59f24fee8bc04e7520505e not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.205100 4717 scope.go:117] "RemoveContainer" containerID="20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.209553 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zhjjq"] Mar 08 05:33:42 
crc kubenswrapper[4717]: I0308 05:33:42.214978 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hgbcl"] Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.223975 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hgbcl"] Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.227529 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7t8t"] Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.230128 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7t8t"] Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.232123 4717 scope.go:117] "RemoveContainer" containerID="1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.253308 4717 scope.go:117] "RemoveContainer" containerID="efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.276586 4717 scope.go:117] "RemoveContainer" containerID="20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.279878 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e\": container with ID starting with 20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e not found: ID does not exist" containerID="20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.279943 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e"} err="failed to get container status 
\"20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e\": rpc error: code = NotFound desc = could not find container \"20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e\": container with ID starting with 20b7a623d6540caf1a0fe8fa0c5ba60f4270ec079a618cc0ff94813375cfdb5e not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.279980 4717 scope.go:117] "RemoveContainer" containerID="1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.280411 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55\": container with ID starting with 1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55 not found: ID does not exist" containerID="1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.280441 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55"} err="failed to get container status \"1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55\": rpc error: code = NotFound desc = could not find container \"1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55\": container with ID starting with 1c6137ba166a97e776089bc10bee388962c00b6f1c27d63047fb61bc13b6ec55 not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.280456 4717 scope.go:117] "RemoveContainer" containerID="efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.280758 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9\": container with ID starting with efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9 not found: ID does not exist" containerID="efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.280805 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9"} err="failed to get container status \"efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9\": rpc error: code = NotFound desc = could not find container \"efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9\": container with ID starting with efa719f2a0ce0b3149345af85e5154fb955feccbd46ccce052ad925b28b955b9 not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.280835 4717 scope.go:117] "RemoveContainer" containerID="6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.294648 4717 scope.go:117] "RemoveContainer" containerID="e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.309870 4717 scope.go:117] "RemoveContainer" containerID="6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.310335 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb\": container with ID starting with 6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb not found: ID does not exist" containerID="6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.310382 4717 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb"} err="failed to get container status \"6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb\": rpc error: code = NotFound desc = could not find container \"6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb\": container with ID starting with 6cf0f815cceda66a9802f1c449d85dad75937a4756940dac8a63ab034db34fbb not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.310413 4717 scope.go:117] "RemoveContainer" containerID="e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.310795 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9\": container with ID starting with e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9 not found: ID does not exist" containerID="e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.310849 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9"} err="failed to get container status \"e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9\": rpc error: code = NotFound desc = could not find container \"e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9\": container with ID starting with e9e683e69120a102657969b6b2012f9daca9921514d941b1f6542227dd16a2f9 not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.310880 4717 scope.go:117] "RemoveContainer" containerID="cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.332966 4717 scope.go:117] "RemoveContainer" 
containerID="75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.349757 4717 scope.go:117] "RemoveContainer" containerID="5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.378502 4717 scope.go:117] "RemoveContainer" containerID="cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.381553 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8\": container with ID starting with cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8 not found: ID does not exist" containerID="cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.382184 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8"} err="failed to get container status \"cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8\": rpc error: code = NotFound desc = could not find container \"cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8\": container with ID starting with cb93dc3ae8d57dc1bab8538b26718c6d0ede09572093ee4e19c20a49465d97c8 not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.382597 4717 scope.go:117] "RemoveContainer" containerID="75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.383852 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6\": container with ID starting with 
75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6 not found: ID does not exist" containerID="75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.383913 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6"} err="failed to get container status \"75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6\": rpc error: code = NotFound desc = could not find container \"75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6\": container with ID starting with 75f5b80df1aa68ea29788b82837bf4cd4ea5232358fe1fa5bb75776e494836c6 not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.383962 4717 scope.go:117] "RemoveContainer" containerID="5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.384801 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283\": container with ID starting with 5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283 not found: ID does not exist" containerID="5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.385067 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283"} err="failed to get container status \"5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283\": rpc error: code = NotFound desc = could not find container \"5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283\": container with ID starting with 5f806a6852bef06ac8247ea7c0332411351802f1a1cabebc8efaf0c9d52be283 not found: ID does not 
exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.385303 4717 scope.go:117] "RemoveContainer" containerID="b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.401453 4717 scope.go:117] "RemoveContainer" containerID="adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.425341 4717 scope.go:117] "RemoveContainer" containerID="1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.443342 4717 scope.go:117] "RemoveContainer" containerID="b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.443910 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf\": container with ID starting with b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf not found: ID does not exist" containerID="b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.443986 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf"} err="failed to get container status \"b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf\": rpc error: code = NotFound desc = could not find container \"b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf\": container with ID starting with b5e8e8dc1359fd09ed6685905b1ffde07701d4b6c705e91504aaaac4b4fc37bf not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.444032 4717 scope.go:117] "RemoveContainer" containerID="adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130" Mar 08 05:33:42 crc 
kubenswrapper[4717]: E0308 05:33:42.444664 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130\": container with ID starting with adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130 not found: ID does not exist" containerID="adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.444714 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130"} err="failed to get container status \"adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130\": rpc error: code = NotFound desc = could not find container \"adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130\": container with ID starting with adb7d8d9f516ba6d22cc3ab0f98c94c895287b83b3332c7d9e883575d0b2f130 not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.444741 4717 scope.go:117] "RemoveContainer" containerID="1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.445158 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc\": container with ID starting with 1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc not found: ID does not exist" containerID="1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.445289 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc"} err="failed to get container status 
\"1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc\": rpc error: code = NotFound desc = could not find container \"1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc\": container with ID starting with 1804f3d70155e04b62cc9a947808664a6a5d7da476395e1508d68601c3c6d5cc not found: ID does not exist" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.946970 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m5skl"] Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947333 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5961d211-7900-41ef-9915-d935e9cec42a" containerName="extract-content" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947355 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5961d211-7900-41ef-9915-d935e9cec42a" containerName="extract-content" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947379 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947393 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947414 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947431 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947446 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5961d211-7900-41ef-9915-d935e9cec42a" containerName="extract-utilities" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947459 4717 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5961d211-7900-41ef-9915-d935e9cec42a" containerName="extract-utilities" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947474 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d612266d-387c-4561-a50f-02cd3cced887" containerName="extract-content" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947487 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d612266d-387c-4561-a50f-02cd3cced887" containerName="extract-content" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947513 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerName="extract-content" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947526 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerName="extract-content" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947541 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerName="extract-utilities" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947554 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerName="extract-utilities" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947572 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947585 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947608 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947620 4717 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947638 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5961d211-7900-41ef-9915-d935e9cec42a" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947650 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5961d211-7900-41ef-9915-d935e9cec42a" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947668 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerName="extract-utilities" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947680 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerName="extract-utilities" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947719 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d612266d-387c-4561-a50f-02cd3cced887" containerName="extract-utilities" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947732 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d612266d-387c-4561-a50f-02cd3cced887" containerName="extract-utilities" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947746 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d612266d-387c-4561-a50f-02cd3cced887" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947757 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d612266d-387c-4561-a50f-02cd3cced887" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: E0308 05:33:42.947777 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerName="extract-content" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947791 4717 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerName="extract-content" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947947 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947973 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.947990 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d612266d-387c-4561-a50f-02cd3cced887" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.948009 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.948024 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5961d211-7900-41ef-9915-d935e9cec42a" containerName="registry-server" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.948315 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" containerName="marketplace-operator" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.949358 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.952848 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 05:33:42 crc kubenswrapper[4717]: I0308 05:33:42.967181 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5skl"] Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.052660 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46pws\" (UniqueName: \"kubernetes.io/projected/ff534094-b1ae-4777-955f-322d8f2bfc65-kube-api-access-46pws\") pod \"certified-operators-m5skl\" (UID: \"ff534094-b1ae-4777-955f-322d8f2bfc65\") " pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.052743 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff534094-b1ae-4777-955f-322d8f2bfc65-catalog-content\") pod \"certified-operators-m5skl\" (UID: \"ff534094-b1ae-4777-955f-322d8f2bfc65\") " pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.052882 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff534094-b1ae-4777-955f-322d8f2bfc65-utilities\") pod \"certified-operators-m5skl\" (UID: \"ff534094-b1ae-4777-955f-322d8f2bfc65\") " pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.119093 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cmqxx" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.155450 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-46pws\" (UniqueName: \"kubernetes.io/projected/ff534094-b1ae-4777-955f-322d8f2bfc65-kube-api-access-46pws\") pod \"certified-operators-m5skl\" (UID: \"ff534094-b1ae-4777-955f-322d8f2bfc65\") " pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.155945 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff534094-b1ae-4777-955f-322d8f2bfc65-catalog-content\") pod \"certified-operators-m5skl\" (UID: \"ff534094-b1ae-4777-955f-322d8f2bfc65\") " pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.156198 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff534094-b1ae-4777-955f-322d8f2bfc65-utilities\") pod \"certified-operators-m5skl\" (UID: \"ff534094-b1ae-4777-955f-322d8f2bfc65\") " pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.157281 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff534094-b1ae-4777-955f-322d8f2bfc65-catalog-content\") pod \"certified-operators-m5skl\" (UID: \"ff534094-b1ae-4777-955f-322d8f2bfc65\") " pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.157535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff534094-b1ae-4777-955f-322d8f2bfc65-utilities\") pod \"certified-operators-m5skl\" (UID: \"ff534094-b1ae-4777-955f-322d8f2bfc65\") " pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.195864 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-46pws\" (UniqueName: \"kubernetes.io/projected/ff534094-b1ae-4777-955f-322d8f2bfc65-kube-api-access-46pws\") pod \"certified-operators-m5skl\" (UID: \"ff534094-b1ae-4777-955f-322d8f2bfc65\") " pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.281671 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.542241 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bvmlr"] Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.544292 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.548311 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.565363 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvmlr"] Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.596596 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5skl"] Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.665758 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzgn2\" (UniqueName: \"kubernetes.io/projected/13ec80f4-5952-4e71-8aaa-18643bdfae3d-kube-api-access-rzgn2\") pod \"redhat-marketplace-bvmlr\" (UID: \"13ec80f4-5952-4e71-8aaa-18643bdfae3d\") " pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.666173 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/13ec80f4-5952-4e71-8aaa-18643bdfae3d-utilities\") pod \"redhat-marketplace-bvmlr\" (UID: \"13ec80f4-5952-4e71-8aaa-18643bdfae3d\") " pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.666214 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13ec80f4-5952-4e71-8aaa-18643bdfae3d-catalog-content\") pod \"redhat-marketplace-bvmlr\" (UID: \"13ec80f4-5952-4e71-8aaa-18643bdfae3d\") " pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.767537 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzgn2\" (UniqueName: \"kubernetes.io/projected/13ec80f4-5952-4e71-8aaa-18643bdfae3d-kube-api-access-rzgn2\") pod \"redhat-marketplace-bvmlr\" (UID: \"13ec80f4-5952-4e71-8aaa-18643bdfae3d\") " pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.767594 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13ec80f4-5952-4e71-8aaa-18643bdfae3d-utilities\") pod \"redhat-marketplace-bvmlr\" (UID: \"13ec80f4-5952-4e71-8aaa-18643bdfae3d\") " pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.767617 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13ec80f4-5952-4e71-8aaa-18643bdfae3d-catalog-content\") pod \"redhat-marketplace-bvmlr\" (UID: \"13ec80f4-5952-4e71-8aaa-18643bdfae3d\") " pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.768083 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/13ec80f4-5952-4e71-8aaa-18643bdfae3d-catalog-content\") pod \"redhat-marketplace-bvmlr\" (UID: \"13ec80f4-5952-4e71-8aaa-18643bdfae3d\") " pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.768228 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13ec80f4-5952-4e71-8aaa-18643bdfae3d-utilities\") pod \"redhat-marketplace-bvmlr\" (UID: \"13ec80f4-5952-4e71-8aaa-18643bdfae3d\") " pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.789810 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06f4ab9f-48eb-410c-8915-c47c5cff1650" path="/var/lib/kubelet/pods/06f4ab9f-48eb-410c-8915-c47c5cff1650/volumes" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.790994 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22aa82e5-83a2-4046-8d11-89e9d34e00e1" path="/var/lib/kubelet/pods/22aa82e5-83a2-4046-8d11-89e9d34e00e1/volumes" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.791659 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce686db-32d9-41b7-80fa-124e094dc4e8" path="/var/lib/kubelet/pods/2ce686db-32d9-41b7-80fa-124e094dc4e8/volumes" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.791666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzgn2\" (UniqueName: \"kubernetes.io/projected/13ec80f4-5952-4e71-8aaa-18643bdfae3d-kube-api-access-rzgn2\") pod \"redhat-marketplace-bvmlr\" (UID: \"13ec80f4-5952-4e71-8aaa-18643bdfae3d\") " pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.792981 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5961d211-7900-41ef-9915-d935e9cec42a" path="/var/lib/kubelet/pods/5961d211-7900-41ef-9915-d935e9cec42a/volumes" Mar 08 
05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.793816 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d612266d-387c-4561-a50f-02cd3cced887" path="/var/lib/kubelet/pods/d612266d-387c-4561-a50f-02cd3cced887/volumes" Mar 08 05:33:43 crc kubenswrapper[4717]: I0308 05:33:43.868645 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:44 crc kubenswrapper[4717]: I0308 05:33:44.127737 4717 generic.go:334] "Generic (PLEG): container finished" podID="ff534094-b1ae-4777-955f-322d8f2bfc65" containerID="aa7dc022936cc7a9a2768289fb279ef64275bc72eb0c9bd6aa4ce685619a87fa" exitCode=0 Mar 08 05:33:44 crc kubenswrapper[4717]: I0308 05:33:44.127868 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5skl" event={"ID":"ff534094-b1ae-4777-955f-322d8f2bfc65","Type":"ContainerDied","Data":"aa7dc022936cc7a9a2768289fb279ef64275bc72eb0c9bd6aa4ce685619a87fa"} Mar 08 05:33:44 crc kubenswrapper[4717]: I0308 05:33:44.127954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5skl" event={"ID":"ff534094-b1ae-4777-955f-322d8f2bfc65","Type":"ContainerStarted","Data":"272933def48fb3c6322e3b1922cc99dbf7280e611cbf9c4e1446fe40fdaf9640"} Mar 08 05:33:44 crc kubenswrapper[4717]: I0308 05:33:44.324469 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvmlr"] Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.135466 4717 generic.go:334] "Generic (PLEG): container finished" podID="13ec80f4-5952-4e71-8aaa-18643bdfae3d" containerID="c02dc01520de110c2987e5b73794b9c32895cac6c57299663e4645307ae4d2b4" exitCode=0 Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.135533 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvmlr" 
event={"ID":"13ec80f4-5952-4e71-8aaa-18643bdfae3d","Type":"ContainerDied","Data":"c02dc01520de110c2987e5b73794b9c32895cac6c57299663e4645307ae4d2b4"} Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.135579 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvmlr" event={"ID":"13ec80f4-5952-4e71-8aaa-18643bdfae3d","Type":"ContainerStarted","Data":"7c6661debb3aed509e8c861b2ef85735679abd9f9df3afa69058eb0cc1842bb7"} Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.345068 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5chdv"] Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.347064 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.351790 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.362515 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5chdv"] Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.463249 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877ad6e6-1569-4e9c-a1fb-a2226718fa2d-utilities\") pod \"redhat-operators-5chdv\" (UID: \"877ad6e6-1569-4e9c-a1fb-a2226718fa2d\") " pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.463421 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2zzf\" (UniqueName: \"kubernetes.io/projected/877ad6e6-1569-4e9c-a1fb-a2226718fa2d-kube-api-access-j2zzf\") pod \"redhat-operators-5chdv\" (UID: \"877ad6e6-1569-4e9c-a1fb-a2226718fa2d\") " 
pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.463549 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877ad6e6-1569-4e9c-a1fb-a2226718fa2d-catalog-content\") pod \"redhat-operators-5chdv\" (UID: \"877ad6e6-1569-4e9c-a1fb-a2226718fa2d\") " pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.565702 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2zzf\" (UniqueName: \"kubernetes.io/projected/877ad6e6-1569-4e9c-a1fb-a2226718fa2d-kube-api-access-j2zzf\") pod \"redhat-operators-5chdv\" (UID: \"877ad6e6-1569-4e9c-a1fb-a2226718fa2d\") " pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.565884 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877ad6e6-1569-4e9c-a1fb-a2226718fa2d-catalog-content\") pod \"redhat-operators-5chdv\" (UID: \"877ad6e6-1569-4e9c-a1fb-a2226718fa2d\") " pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.565944 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877ad6e6-1569-4e9c-a1fb-a2226718fa2d-utilities\") pod \"redhat-operators-5chdv\" (UID: \"877ad6e6-1569-4e9c-a1fb-a2226718fa2d\") " pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.566801 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/877ad6e6-1569-4e9c-a1fb-a2226718fa2d-catalog-content\") pod \"redhat-operators-5chdv\" (UID: \"877ad6e6-1569-4e9c-a1fb-a2226718fa2d\") " 
pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.566927 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/877ad6e6-1569-4e9c-a1fb-a2226718fa2d-utilities\") pod \"redhat-operators-5chdv\" (UID: \"877ad6e6-1569-4e9c-a1fb-a2226718fa2d\") " pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.599271 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2zzf\" (UniqueName: \"kubernetes.io/projected/877ad6e6-1569-4e9c-a1fb-a2226718fa2d-kube-api-access-j2zzf\") pod \"redhat-operators-5chdv\" (UID: \"877ad6e6-1569-4e9c-a1fb-a2226718fa2d\") " pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.678786 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.952845 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gzsvv"] Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.954445 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.957832 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.960803 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gzsvv"] Mar 08 05:33:45 crc kubenswrapper[4717]: I0308 05:33:45.983726 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5chdv"] Mar 08 05:33:45 crc kubenswrapper[4717]: W0308 05:33:45.991009 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod877ad6e6_1569_4e9c_a1fb_a2226718fa2d.slice/crio-67ef01c798a95f1e8967fd9e43303e405504051d9df7a8f65ea81a345758640e WatchSource:0}: Error finding container 67ef01c798a95f1e8967fd9e43303e405504051d9df7a8f65ea81a345758640e: Status 404 returned error can't find the container with id 67ef01c798a95f1e8967fd9e43303e405504051d9df7a8f65ea81a345758640e Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.077084 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbkv8\" (UniqueName: \"kubernetes.io/projected/9e076e50-edc7-4172-bb69-ca35340a0f0b-kube-api-access-vbkv8\") pod \"community-operators-gzsvv\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") " pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.077144 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-catalog-content\") pod \"community-operators-gzsvv\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") " pod="openshift-marketplace/community-operators-gzsvv" 
Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.077186 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-utilities\") pod \"community-operators-gzsvv\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") " pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.143110 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5chdv" event={"ID":"877ad6e6-1569-4e9c-a1fb-a2226718fa2d","Type":"ContainerStarted","Data":"67ef01c798a95f1e8967fd9e43303e405504051d9df7a8f65ea81a345758640e"} Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.146231 4717 generic.go:334] "Generic (PLEG): container finished" podID="ff534094-b1ae-4777-955f-322d8f2bfc65" containerID="403be7ac1fa6d420abb25ec8f6273b66f221ce886ba73a26a1c2e96b62e12264" exitCode=0 Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.146336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5skl" event={"ID":"ff534094-b1ae-4777-955f-322d8f2bfc65","Type":"ContainerDied","Data":"403be7ac1fa6d420abb25ec8f6273b66f221ce886ba73a26a1c2e96b62e12264"} Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.149721 4717 generic.go:334] "Generic (PLEG): container finished" podID="13ec80f4-5952-4e71-8aaa-18643bdfae3d" containerID="260d0281845a991504efe78849f5f5bbade9d8b5523dffc56b1b73b6a76e08f7" exitCode=0 Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.149761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvmlr" event={"ID":"13ec80f4-5952-4e71-8aaa-18643bdfae3d","Type":"ContainerDied","Data":"260d0281845a991504efe78849f5f5bbade9d8b5523dffc56b1b73b6a76e08f7"} Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.179056 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-utilities\") pod \"community-operators-gzsvv\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") " pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.179177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbkv8\" (UniqueName: \"kubernetes.io/projected/9e076e50-edc7-4172-bb69-ca35340a0f0b-kube-api-access-vbkv8\") pod \"community-operators-gzsvv\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") " pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.179230 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-catalog-content\") pod \"community-operators-gzsvv\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") " pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.180346 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-catalog-content\") pod \"community-operators-gzsvv\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") " pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.183802 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-utilities\") pod \"community-operators-gzsvv\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") " pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.220606 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vbkv8\" (UniqueName: \"kubernetes.io/projected/9e076e50-edc7-4172-bb69-ca35340a0f0b-kube-api-access-vbkv8\") pod \"community-operators-gzsvv\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") " pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.370745 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:46 crc kubenswrapper[4717]: I0308 05:33:46.708178 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gzsvv"] Mar 08 05:33:46 crc kubenswrapper[4717]: W0308 05:33:46.717080 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e076e50_edc7_4172_bb69_ca35340a0f0b.slice/crio-7447a01ddc2f0251fe2055220a7e475560a6daf7ea34ff7aee478753b96c772d WatchSource:0}: Error finding container 7447a01ddc2f0251fe2055220a7e475560a6daf7ea34ff7aee478753b96c772d: Status 404 returned error can't find the container with id 7447a01ddc2f0251fe2055220a7e475560a6daf7ea34ff7aee478753b96c772d Mar 08 05:33:47 crc kubenswrapper[4717]: I0308 05:33:47.160186 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5skl" event={"ID":"ff534094-b1ae-4777-955f-322d8f2bfc65","Type":"ContainerStarted","Data":"0e06595c8f380ba4e29fc293cfab533f9bc177711bf3dd406c0cab08a1052eb2"} Mar 08 05:33:47 crc kubenswrapper[4717]: I0308 05:33:47.163919 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvmlr" event={"ID":"13ec80f4-5952-4e71-8aaa-18643bdfae3d","Type":"ContainerStarted","Data":"1e1e6995f7b4772b362479cead62c6d8e680dd3302e2dd96238b819b48af81c3"} Mar 08 05:33:47 crc kubenswrapper[4717]: I0308 05:33:47.168343 4717 generic.go:334] "Generic (PLEG): container finished" podID="9e076e50-edc7-4172-bb69-ca35340a0f0b" 
containerID="203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e" exitCode=0 Mar 08 05:33:47 crc kubenswrapper[4717]: I0308 05:33:47.168424 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gzsvv" event={"ID":"9e076e50-edc7-4172-bb69-ca35340a0f0b","Type":"ContainerDied","Data":"203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e"} Mar 08 05:33:47 crc kubenswrapper[4717]: I0308 05:33:47.168476 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gzsvv" event={"ID":"9e076e50-edc7-4172-bb69-ca35340a0f0b","Type":"ContainerStarted","Data":"7447a01ddc2f0251fe2055220a7e475560a6daf7ea34ff7aee478753b96c772d"} Mar 08 05:33:47 crc kubenswrapper[4717]: I0308 05:33:47.193627 4717 generic.go:334] "Generic (PLEG): container finished" podID="877ad6e6-1569-4e9c-a1fb-a2226718fa2d" containerID="4b7b9e17914c50b774e4cb58a16a4137118bbc9e4cd11e8eb1858b04adfff829" exitCode=0 Mar 08 05:33:47 crc kubenswrapper[4717]: I0308 05:33:47.193874 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5chdv" event={"ID":"877ad6e6-1569-4e9c-a1fb-a2226718fa2d","Type":"ContainerDied","Data":"4b7b9e17914c50b774e4cb58a16a4137118bbc9e4cd11e8eb1858b04adfff829"} Mar 08 05:33:47 crc kubenswrapper[4717]: I0308 05:33:47.195722 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m5skl" podStartSLOduration=2.710693962 podStartE2EDuration="5.195674182s" podCreationTimestamp="2026-03-08 05:33:42 +0000 UTC" firstStartedPulling="2026-03-08 05:33:44.130606832 +0000 UTC m=+451.048255676" lastFinishedPulling="2026-03-08 05:33:46.615587042 +0000 UTC m=+453.533235896" observedRunningTime="2026-03-08 05:33:47.185906323 +0000 UTC m=+454.103555167" watchObservedRunningTime="2026-03-08 05:33:47.195674182 +0000 UTC m=+454.113323036" Mar 08 05:33:47 crc kubenswrapper[4717]: I0308 
05:33:47.212373 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bvmlr" podStartSLOduration=2.7606118520000003 podStartE2EDuration="4.212344357s" podCreationTimestamp="2026-03-08 05:33:43 +0000 UTC" firstStartedPulling="2026-03-08 05:33:45.137256289 +0000 UTC m=+452.054905173" lastFinishedPulling="2026-03-08 05:33:46.588988824 +0000 UTC m=+453.506637678" observedRunningTime="2026-03-08 05:33:47.211967308 +0000 UTC m=+454.129616162" watchObservedRunningTime="2026-03-08 05:33:47.212344357 +0000 UTC m=+454.129993201" Mar 08 05:33:48 crc kubenswrapper[4717]: I0308 05:33:48.201944 4717 generic.go:334] "Generic (PLEG): container finished" podID="9e076e50-edc7-4172-bb69-ca35340a0f0b" containerID="50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617" exitCode=0 Mar 08 05:33:48 crc kubenswrapper[4717]: I0308 05:33:48.202481 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gzsvv" event={"ID":"9e076e50-edc7-4172-bb69-ca35340a0f0b","Type":"ContainerDied","Data":"50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617"} Mar 08 05:33:48 crc kubenswrapper[4717]: I0308 05:33:48.207741 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5chdv" event={"ID":"877ad6e6-1569-4e9c-a1fb-a2226718fa2d","Type":"ContainerStarted","Data":"05dfe399a9ade8359b6cf21ec3cb9f93245d15256951362098c258da3fff1d81"} Mar 08 05:33:49 crc kubenswrapper[4717]: I0308 05:33:49.217993 4717 generic.go:334] "Generic (PLEG): container finished" podID="877ad6e6-1569-4e9c-a1fb-a2226718fa2d" containerID="05dfe399a9ade8359b6cf21ec3cb9f93245d15256951362098c258da3fff1d81" exitCode=0 Mar 08 05:33:49 crc kubenswrapper[4717]: I0308 05:33:49.218133 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5chdv" 
event={"ID":"877ad6e6-1569-4e9c-a1fb-a2226718fa2d","Type":"ContainerDied","Data":"05dfe399a9ade8359b6cf21ec3cb9f93245d15256951362098c258da3fff1d81"} Mar 08 05:33:49 crc kubenswrapper[4717]: I0308 05:33:49.224510 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gzsvv" event={"ID":"9e076e50-edc7-4172-bb69-ca35340a0f0b","Type":"ContainerStarted","Data":"4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b"} Mar 08 05:33:50 crc kubenswrapper[4717]: I0308 05:33:50.236669 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5chdv" event={"ID":"877ad6e6-1569-4e9c-a1fb-a2226718fa2d","Type":"ContainerStarted","Data":"b8f0362d4affbdf913e414b2e74de82fac180af1c17ae126e5b2731afe44f60e"} Mar 08 05:33:50 crc kubenswrapper[4717]: I0308 05:33:50.267154 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gzsvv" podStartSLOduration=3.799317696 podStartE2EDuration="5.26712053s" podCreationTimestamp="2026-03-08 05:33:45 +0000 UTC" firstStartedPulling="2026-03-08 05:33:47.171875056 +0000 UTC m=+454.089523900" lastFinishedPulling="2026-03-08 05:33:48.63967789 +0000 UTC m=+455.557326734" observedRunningTime="2026-03-08 05:33:49.276595631 +0000 UTC m=+456.194244505" watchObservedRunningTime="2026-03-08 05:33:50.26712053 +0000 UTC m=+457.184769404" Mar 08 05:33:50 crc kubenswrapper[4717]: I0308 05:33:50.274427 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5chdv" podStartSLOduration=2.819469195 podStartE2EDuration="5.27440074s" podCreationTimestamp="2026-03-08 05:33:45 +0000 UTC" firstStartedPulling="2026-03-08 05:33:47.198136495 +0000 UTC m=+454.115785339" lastFinishedPulling="2026-03-08 05:33:49.65306804 +0000 UTC m=+456.570716884" observedRunningTime="2026-03-08 05:33:50.264940386 +0000 UTC m=+457.182589270" 
watchObservedRunningTime="2026-03-08 05:33:50.27440074 +0000 UTC m=+457.192049614" Mar 08 05:33:53 crc kubenswrapper[4717]: I0308 05:33:53.282160 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:53 crc kubenswrapper[4717]: I0308 05:33:53.282626 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:53 crc kubenswrapper[4717]: I0308 05:33:53.372237 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:53 crc kubenswrapper[4717]: I0308 05:33:53.869578 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:53 crc kubenswrapper[4717]: I0308 05:33:53.870403 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:53 crc kubenswrapper[4717]: I0308 05:33:53.936352 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:54 crc kubenswrapper[4717]: I0308 05:33:54.334368 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m5skl" Mar 08 05:33:54 crc kubenswrapper[4717]: I0308 05:33:54.338195 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bvmlr" Mar 08 05:33:55 crc kubenswrapper[4717]: I0308 05:33:55.679811 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:55 crc kubenswrapper[4717]: I0308 05:33:55.680378 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:33:56 crc 
kubenswrapper[4717]: I0308 05:33:56.371747 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:56 crc kubenswrapper[4717]: I0308 05:33:56.371831 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:56 crc kubenswrapper[4717]: I0308 05:33:56.420524 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:33:56 crc kubenswrapper[4717]: I0308 05:33:56.748536 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5chdv" podUID="877ad6e6-1569-4e9c-a1fb-a2226718fa2d" containerName="registry-server" probeResult="failure" output=< Mar 08 05:33:56 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 05:33:56 crc kubenswrapper[4717]: > Mar 08 05:33:57 crc kubenswrapper[4717]: I0308 05:33:57.356106 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gzsvv" Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.138572 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549134-cvqtc"] Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.139368 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549134-cvqtc" Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.141593 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.141763 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.142785 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.152081 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549134-cvqtc"] Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.246282 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gdc4\" (UniqueName: \"kubernetes.io/projected/bdeb988f-cd4f-474b-b681-51b543f513ef-kube-api-access-4gdc4\") pod \"auto-csr-approver-29549134-cvqtc\" (UID: \"bdeb988f-cd4f-474b-b681-51b543f513ef\") " pod="openshift-infra/auto-csr-approver-29549134-cvqtc" Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.348505 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gdc4\" (UniqueName: \"kubernetes.io/projected/bdeb988f-cd4f-474b-b681-51b543f513ef-kube-api-access-4gdc4\") pod \"auto-csr-approver-29549134-cvqtc\" (UID: \"bdeb988f-cd4f-474b-b681-51b543f513ef\") " pod="openshift-infra/auto-csr-approver-29549134-cvqtc" Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.386964 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gdc4\" (UniqueName: \"kubernetes.io/projected/bdeb988f-cd4f-474b-b681-51b543f513ef-kube-api-access-4gdc4\") pod \"auto-csr-approver-29549134-cvqtc\" (UID: \"bdeb988f-cd4f-474b-b681-51b543f513ef\") " 
pod="openshift-infra/auto-csr-approver-29549134-cvqtc" Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.457110 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549134-cvqtc" Mar 08 05:34:00 crc kubenswrapper[4717]: I0308 05:34:00.934137 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549134-cvqtc"] Mar 08 05:34:00 crc kubenswrapper[4717]: W0308 05:34:00.938761 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdeb988f_cd4f_474b_b681_51b543f513ef.slice/crio-004700b5a0c66f319dc7d760fef7f7dd230738329474bacc86a02a6c966204e4 WatchSource:0}: Error finding container 004700b5a0c66f319dc7d760fef7f7dd230738329474bacc86a02a6c966204e4: Status 404 returned error can't find the container with id 004700b5a0c66f319dc7d760fef7f7dd230738329474bacc86a02a6c966204e4 Mar 08 05:34:01 crc kubenswrapper[4717]: I0308 05:34:01.316895 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549134-cvqtc" event={"ID":"bdeb988f-cd4f-474b-b681-51b543f513ef","Type":"ContainerStarted","Data":"004700b5a0c66f319dc7d760fef7f7dd230738329474bacc86a02a6c966204e4"} Mar 08 05:34:01 crc kubenswrapper[4717]: I0308 05:34:01.408181 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" podUID="d7acdc7a-9697-4daf-9b82-253fcb5f1c55" containerName="registry" containerID="cri-o://bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7" gracePeriod=30 Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:01.907772 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:01.999081 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:01.999140 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-tls\") pod \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:01.999170 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-certificates\") pod \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:01.999239 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99x88\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-kube-api-access-99x88\") pod \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:01.999293 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-ca-trust-extracted\") pod \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:01.999315 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-bound-sa-token\") pod \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:01.999335 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-installation-pull-secrets\") pod \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:01.999404 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-trusted-ca\") pod \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\" (UID: \"d7acdc7a-9697-4daf-9b82-253fcb5f1c55\") " Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.000552 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d7acdc7a-9697-4daf-9b82-253fcb5f1c55" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.009725 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d7acdc7a-9697-4daf-9b82-253fcb5f1c55" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.017013 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-kube-api-access-99x88" (OuterVolumeSpecName: "kube-api-access-99x88") pod "d7acdc7a-9697-4daf-9b82-253fcb5f1c55" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55"). InnerVolumeSpecName "kube-api-access-99x88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.017720 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d7acdc7a-9697-4daf-9b82-253fcb5f1c55" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.031942 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d7acdc7a-9697-4daf-9b82-253fcb5f1c55" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.032779 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d7acdc7a-9697-4daf-9b82-253fcb5f1c55" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.035152 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d7acdc7a-9697-4daf-9b82-253fcb5f1c55" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.047725 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d7acdc7a-9697-4daf-9b82-253fcb5f1c55" (UID: "d7acdc7a-9697-4daf-9b82-253fcb5f1c55"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.101482 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99x88\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-kube-api-access-99x88\") on node \"crc\" DevicePath \"\"" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.101526 4717 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.101549 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.101569 4717 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.101590 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.101611 4717 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.101631 4717 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7acdc7a-9697-4daf-9b82-253fcb5f1c55-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.325890 4717 generic.go:334] "Generic (PLEG): container finished" podID="d7acdc7a-9697-4daf-9b82-253fcb5f1c55" containerID="bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7" exitCode=0 Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.325966 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.325959 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" event={"ID":"d7acdc7a-9697-4daf-9b82-253fcb5f1c55","Type":"ContainerDied","Data":"bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7"} Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.326045 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rmg8f" event={"ID":"d7acdc7a-9697-4daf-9b82-253fcb5f1c55","Type":"ContainerDied","Data":"4af17d465547681ccefd9c4f90db8a92824296b843bbcb22106efad5468a35b0"} Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.326081 4717 scope.go:117] "RemoveContainer" containerID="bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.328565 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549134-cvqtc" event={"ID":"bdeb988f-cd4f-474b-b681-51b543f513ef","Type":"ContainerStarted","Data":"1eb582cdaa4328a91b72e2af556c6507fc77403ac3eedb455b86e2b240621b83"} Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.351999 4717 scope.go:117] "RemoveContainer" containerID="bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7" Mar 08 05:34:02 crc kubenswrapper[4717]: E0308 05:34:02.355261 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7\": container with ID starting with bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7 not found: ID does not exist" containerID="bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.355305 4717 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7"} err="failed to get container status \"bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7\": rpc error: code = NotFound desc = could not find container \"bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7\": container with ID starting with bf87a7cd1cc4eb869de902b4bb7beff3599fcc816ad29ef7b177839721f1c3f7 not found: ID does not exist" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.355524 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549134-cvqtc" podStartSLOduration=1.4418424810000001 podStartE2EDuration="2.355512767s" podCreationTimestamp="2026-03-08 05:34:00 +0000 UTC" firstStartedPulling="2026-03-08 05:34:00.94088901 +0000 UTC m=+467.858537854" lastFinishedPulling="2026-03-08 05:34:01.854559256 +0000 UTC m=+468.772208140" observedRunningTime="2026-03-08 05:34:02.353130438 +0000 UTC m=+469.270779322" watchObservedRunningTime="2026-03-08 05:34:02.355512767 +0000 UTC m=+469.273161611" Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.379337 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmg8f"] Mar 08 05:34:02 crc kubenswrapper[4717]: I0308 05:34:02.383975 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rmg8f"] Mar 08 05:34:03 crc kubenswrapper[4717]: I0308 05:34:03.338664 4717 generic.go:334] "Generic (PLEG): container finished" podID="bdeb988f-cd4f-474b-b681-51b543f513ef" containerID="1eb582cdaa4328a91b72e2af556c6507fc77403ac3eedb455b86e2b240621b83" exitCode=0 Mar 08 05:34:03 crc kubenswrapper[4717]: I0308 05:34:03.339002 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549134-cvqtc" 
event={"ID":"bdeb988f-cd4f-474b-b681-51b543f513ef","Type":"ContainerDied","Data":"1eb582cdaa4328a91b72e2af556c6507fc77403ac3eedb455b86e2b240621b83"} Mar 08 05:34:03 crc kubenswrapper[4717]: I0308 05:34:03.795811 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7acdc7a-9697-4daf-9b82-253fcb5f1c55" path="/var/lib/kubelet/pods/d7acdc7a-9697-4daf-9b82-253fcb5f1c55/volumes" Mar 08 05:34:04 crc kubenswrapper[4717]: I0308 05:34:04.120510 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:34:04 crc kubenswrapper[4717]: I0308 05:34:04.120609 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:34:04 crc kubenswrapper[4717]: I0308 05:34:04.629026 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549134-cvqtc" Mar 08 05:34:04 crc kubenswrapper[4717]: I0308 05:34:04.757288 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gdc4\" (UniqueName: \"kubernetes.io/projected/bdeb988f-cd4f-474b-b681-51b543f513ef-kube-api-access-4gdc4\") pod \"bdeb988f-cd4f-474b-b681-51b543f513ef\" (UID: \"bdeb988f-cd4f-474b-b681-51b543f513ef\") " Mar 08 05:34:04 crc kubenswrapper[4717]: I0308 05:34:04.769883 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdeb988f-cd4f-474b-b681-51b543f513ef-kube-api-access-4gdc4" (OuterVolumeSpecName: "kube-api-access-4gdc4") pod "bdeb988f-cd4f-474b-b681-51b543f513ef" (UID: "bdeb988f-cd4f-474b-b681-51b543f513ef"). InnerVolumeSpecName "kube-api-access-4gdc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:34:04 crc kubenswrapper[4717]: I0308 05:34:04.859862 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gdc4\" (UniqueName: \"kubernetes.io/projected/bdeb988f-cd4f-474b-b681-51b543f513ef-kube-api-access-4gdc4\") on node \"crc\" DevicePath \"\"" Mar 08 05:34:05 crc kubenswrapper[4717]: I0308 05:34:05.355439 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549134-cvqtc" event={"ID":"bdeb988f-cd4f-474b-b681-51b543f513ef","Type":"ContainerDied","Data":"004700b5a0c66f319dc7d760fef7f7dd230738329474bacc86a02a6c966204e4"} Mar 08 05:34:05 crc kubenswrapper[4717]: I0308 05:34:05.355883 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="004700b5a0c66f319dc7d760fef7f7dd230738329474bacc86a02a6c966204e4" Mar 08 05:34:05 crc kubenswrapper[4717]: I0308 05:34:05.355497 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549134-cvqtc" Mar 08 05:34:05 crc kubenswrapper[4717]: I0308 05:34:05.420773 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549128-gnphx"] Mar 08 05:34:05 crc kubenswrapper[4717]: I0308 05:34:05.423702 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549128-gnphx"] Mar 08 05:34:05 crc kubenswrapper[4717]: I0308 05:34:05.755727 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:34:05 crc kubenswrapper[4717]: I0308 05:34:05.795420 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2" path="/var/lib/kubelet/pods/72b34140-c6e1-4ff6-a33a-f61a1a2ae1a2/volumes" Mar 08 05:34:05 crc kubenswrapper[4717]: I0308 05:34:05.835453 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5chdv" Mar 08 05:34:34 crc kubenswrapper[4717]: I0308 05:34:34.120322 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:34:34 crc kubenswrapper[4717]: I0308 05:34:34.121391 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:34:34 crc kubenswrapper[4717]: I0308 05:34:34.121477 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:34:34 crc kubenswrapper[4717]: I0308 05:34:34.122745 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28bae32581ceebc3d7a7b35115231ff3e24099f510bd0f100a271378cb568f79"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 05:34:34 crc kubenswrapper[4717]: I0308 05:34:34.122935 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://28bae32581ceebc3d7a7b35115231ff3e24099f510bd0f100a271378cb568f79" gracePeriod=600 Mar 08 05:34:34 crc kubenswrapper[4717]: I0308 05:34:34.580256 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="28bae32581ceebc3d7a7b35115231ff3e24099f510bd0f100a271378cb568f79" exitCode=0 Mar 08 05:34:34 crc kubenswrapper[4717]: I0308 05:34:34.580376 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"28bae32581ceebc3d7a7b35115231ff3e24099f510bd0f100a271378cb568f79"} Mar 08 05:34:34 crc kubenswrapper[4717]: I0308 05:34:34.580795 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"c9ab5c99eb8a5c63b3392563af981b69aeb09a7e94e4e6bbdd3f13cd801f288c"} Mar 08 05:34:34 crc kubenswrapper[4717]: I0308 05:34:34.580838 4717 scope.go:117] "RemoveContainer" 
containerID="50bb41c998dce6a9219837f8206ea075116fc797bd3875af9f37d6cc8a9bb92c" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.152586 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549136-f6jtt"] Mar 08 05:36:00 crc kubenswrapper[4717]: E0308 05:36:00.154060 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdeb988f-cd4f-474b-b681-51b543f513ef" containerName="oc" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.154091 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdeb988f-cd4f-474b-b681-51b543f513ef" containerName="oc" Mar 08 05:36:00 crc kubenswrapper[4717]: E0308 05:36:00.154136 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7acdc7a-9697-4daf-9b82-253fcb5f1c55" containerName="registry" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.154150 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7acdc7a-9697-4daf-9b82-253fcb5f1c55" containerName="registry" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.154354 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7acdc7a-9697-4daf-9b82-253fcb5f1c55" containerName="registry" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.154380 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdeb988f-cd4f-474b-b681-51b543f513ef" containerName="oc" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.155241 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549136-f6jtt" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.158847 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.159258 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.159432 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.162770 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549136-f6jtt"] Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.347239 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sv27\" (UniqueName: \"kubernetes.io/projected/f1f1879f-6f47-473f-a15e-96947179c63b-kube-api-access-5sv27\") pod \"auto-csr-approver-29549136-f6jtt\" (UID: \"f1f1879f-6f47-473f-a15e-96947179c63b\") " pod="openshift-infra/auto-csr-approver-29549136-f6jtt" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.449164 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sv27\" (UniqueName: \"kubernetes.io/projected/f1f1879f-6f47-473f-a15e-96947179c63b-kube-api-access-5sv27\") pod \"auto-csr-approver-29549136-f6jtt\" (UID: \"f1f1879f-6f47-473f-a15e-96947179c63b\") " pod="openshift-infra/auto-csr-approver-29549136-f6jtt" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.486629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sv27\" (UniqueName: \"kubernetes.io/projected/f1f1879f-6f47-473f-a15e-96947179c63b-kube-api-access-5sv27\") pod \"auto-csr-approver-29549136-f6jtt\" (UID: \"f1f1879f-6f47-473f-a15e-96947179c63b\") " 
pod="openshift-infra/auto-csr-approver-29549136-f6jtt" Mar 08 05:36:00 crc kubenswrapper[4717]: I0308 05:36:00.491789 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549136-f6jtt" Mar 08 05:36:01 crc kubenswrapper[4717]: I0308 05:36:01.037344 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549136-f6jtt"] Mar 08 05:36:01 crc kubenswrapper[4717]: I0308 05:36:01.048949 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 05:36:01 crc kubenswrapper[4717]: I0308 05:36:01.297958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549136-f6jtt" event={"ID":"f1f1879f-6f47-473f-a15e-96947179c63b","Type":"ContainerStarted","Data":"b8bbdbfe159d4360ffebf2153e140f117a4f35807f87371279de648b355159d3"} Mar 08 05:36:03 crc kubenswrapper[4717]: I0308 05:36:03.333545 4717 generic.go:334] "Generic (PLEG): container finished" podID="f1f1879f-6f47-473f-a15e-96947179c63b" containerID="7d5903acb87ab40f377eabefcb5de268caedc87b7afcd55c21ceef15eff48419" exitCode=0 Mar 08 05:36:03 crc kubenswrapper[4717]: I0308 05:36:03.333757 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549136-f6jtt" event={"ID":"f1f1879f-6f47-473f-a15e-96947179c63b","Type":"ContainerDied","Data":"7d5903acb87ab40f377eabefcb5de268caedc87b7afcd55c21ceef15eff48419"} Mar 08 05:36:04 crc kubenswrapper[4717]: I0308 05:36:04.693797 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549136-f6jtt" Mar 08 05:36:04 crc kubenswrapper[4717]: I0308 05:36:04.824930 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sv27\" (UniqueName: \"kubernetes.io/projected/f1f1879f-6f47-473f-a15e-96947179c63b-kube-api-access-5sv27\") pod \"f1f1879f-6f47-473f-a15e-96947179c63b\" (UID: \"f1f1879f-6f47-473f-a15e-96947179c63b\") " Mar 08 05:36:04 crc kubenswrapper[4717]: I0308 05:36:04.834895 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f1879f-6f47-473f-a15e-96947179c63b-kube-api-access-5sv27" (OuterVolumeSpecName: "kube-api-access-5sv27") pod "f1f1879f-6f47-473f-a15e-96947179c63b" (UID: "f1f1879f-6f47-473f-a15e-96947179c63b"). InnerVolumeSpecName "kube-api-access-5sv27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:36:04 crc kubenswrapper[4717]: I0308 05:36:04.927734 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sv27\" (UniqueName: \"kubernetes.io/projected/f1f1879f-6f47-473f-a15e-96947179c63b-kube-api-access-5sv27\") on node \"crc\" DevicePath \"\"" Mar 08 05:36:05 crc kubenswrapper[4717]: I0308 05:36:05.353537 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549136-f6jtt" event={"ID":"f1f1879f-6f47-473f-a15e-96947179c63b","Type":"ContainerDied","Data":"b8bbdbfe159d4360ffebf2153e140f117a4f35807f87371279de648b355159d3"} Mar 08 05:36:05 crc kubenswrapper[4717]: I0308 05:36:05.353614 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8bbdbfe159d4360ffebf2153e140f117a4f35807f87371279de648b355159d3" Mar 08 05:36:05 crc kubenswrapper[4717]: I0308 05:36:05.353663 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549136-f6jtt" Mar 08 05:36:05 crc kubenswrapper[4717]: I0308 05:36:05.798004 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549130-sbndg"] Mar 08 05:36:05 crc kubenswrapper[4717]: I0308 05:36:05.799080 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549130-sbndg"] Mar 08 05:36:07 crc kubenswrapper[4717]: I0308 05:36:07.793746 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="662f5bb0-c453-44f7-944a-5e39e1a580e9" path="/var/lib/kubelet/pods/662f5bb0-c453-44f7-944a-5e39e1a580e9/volumes" Mar 08 05:36:23 crc kubenswrapper[4717]: I0308 05:36:23.636304 4717 scope.go:117] "RemoveContainer" containerID="4f7178bea61a5d816f4de80ed570964febc8c0976cf1b97c4e5e81ed0b42bf1d" Mar 08 05:36:23 crc kubenswrapper[4717]: I0308 05:36:23.670609 4717 scope.go:117] "RemoveContainer" containerID="624c4a7f503af7d7628529821ea94445c058fd15c3228226aabf8f4d56e7bb88" Mar 08 05:36:23 crc kubenswrapper[4717]: I0308 05:36:23.692979 4717 scope.go:117] "RemoveContainer" containerID="10c91eb7ceb9954f58af7155495ffdc92f9e0ce43fa023a49ee709e7eb89a677" Mar 08 05:36:34 crc kubenswrapper[4717]: I0308 05:36:34.119814 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:36:34 crc kubenswrapper[4717]: I0308 05:36:34.120990 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:37:04 crc 
kubenswrapper[4717]: I0308 05:37:04.121001 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:37:04 crc kubenswrapper[4717]: I0308 05:37:04.122263 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:37:23 crc kubenswrapper[4717]: I0308 05:37:23.772578 4717 scope.go:117] "RemoveContainer" containerID="aa4de12c4a9e874febb7019781a42ed7046d898a4eb05229908a04ca2e2ec089" Mar 08 05:37:23 crc kubenswrapper[4717]: I0308 05:37:23.823129 4717 scope.go:117] "RemoveContainer" containerID="8bb5058eeb2c4feb1b3b817f8665954f29c86c298eeadd15978e822edc7e039b" Mar 08 05:37:23 crc kubenswrapper[4717]: I0308 05:37:23.882058 4717 scope.go:117] "RemoveContainer" containerID="6843ae263f26df42b918308af5e9e8896f253a409750ad92d42291f35ed5f955" Mar 08 05:37:23 crc kubenswrapper[4717]: I0308 05:37:23.935096 4717 scope.go:117] "RemoveContainer" containerID="a07ae705cad49c7fb94fb794615bd28936bfa2e46656ac76b94da3a1af96a39f" Mar 08 05:37:34 crc kubenswrapper[4717]: I0308 05:37:34.121102 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:37:34 crc kubenswrapper[4717]: I0308 05:37:34.122243 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:37:34 crc kubenswrapper[4717]: I0308 05:37:34.122927 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:37:34 crc kubenswrapper[4717]: I0308 05:37:34.124477 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9ab5c99eb8a5c63b3392563af981b69aeb09a7e94e4e6bbdd3f13cd801f288c"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 05:37:34 crc kubenswrapper[4717]: I0308 05:37:34.124582 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://c9ab5c99eb8a5c63b3392563af981b69aeb09a7e94e4e6bbdd3f13cd801f288c" gracePeriod=600 Mar 08 05:37:35 crc kubenswrapper[4717]: I0308 05:37:35.097977 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="c9ab5c99eb8a5c63b3392563af981b69aeb09a7e94e4e6bbdd3f13cd801f288c" exitCode=0 Mar 08 05:37:35 crc kubenswrapper[4717]: I0308 05:37:35.098087 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"c9ab5c99eb8a5c63b3392563af981b69aeb09a7e94e4e6bbdd3f13cd801f288c"} Mar 08 05:37:35 crc kubenswrapper[4717]: I0308 05:37:35.098598 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"cf6a478def8e3551d842bb82fdd1ad06931612308caf68537b97f44e9f97c812"} Mar 08 05:37:35 crc kubenswrapper[4717]: I0308 05:37:35.098633 4717 scope.go:117] "RemoveContainer" containerID="28bae32581ceebc3d7a7b35115231ff3e24099f510bd0f100a271378cb568f79" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.157393 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549138-sbrdb"] Mar 08 05:38:00 crc kubenswrapper[4717]: E0308 05:38:00.158925 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f1879f-6f47-473f-a15e-96947179c63b" containerName="oc" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.158953 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f1879f-6f47-473f-a15e-96947179c63b" containerName="oc" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.159146 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f1879f-6f47-473f-a15e-96947179c63b" containerName="oc" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.159880 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549138-sbrdb" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.165481 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549138-sbrdb"] Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.165882 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.165879 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.165948 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.324717 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qttxw\" (UniqueName: \"kubernetes.io/projected/15c2d4c4-71ce-4639-9fa9-5147173cddfb-kube-api-access-qttxw\") pod \"auto-csr-approver-29549138-sbrdb\" (UID: \"15c2d4c4-71ce-4639-9fa9-5147173cddfb\") " pod="openshift-infra/auto-csr-approver-29549138-sbrdb" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.426078 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qttxw\" (UniqueName: \"kubernetes.io/projected/15c2d4c4-71ce-4639-9fa9-5147173cddfb-kube-api-access-qttxw\") pod \"auto-csr-approver-29549138-sbrdb\" (UID: \"15c2d4c4-71ce-4639-9fa9-5147173cddfb\") " pod="openshift-infra/auto-csr-approver-29549138-sbrdb" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.462618 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qttxw\" (UniqueName: \"kubernetes.io/projected/15c2d4c4-71ce-4639-9fa9-5147173cddfb-kube-api-access-qttxw\") pod \"auto-csr-approver-29549138-sbrdb\" (UID: \"15c2d4c4-71ce-4639-9fa9-5147173cddfb\") " 
pod="openshift-infra/auto-csr-approver-29549138-sbrdb" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.496234 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549138-sbrdb" Mar 08 05:38:00 crc kubenswrapper[4717]: I0308 05:38:00.789406 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549138-sbrdb"] Mar 08 05:38:01 crc kubenswrapper[4717]: I0308 05:38:01.363261 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549138-sbrdb" event={"ID":"15c2d4c4-71ce-4639-9fa9-5147173cddfb","Type":"ContainerStarted","Data":"7219605ef25a9570e61e55c8e2406c3a587c565552b7455e6a2e4678e689abc5"} Mar 08 05:38:02 crc kubenswrapper[4717]: I0308 05:38:02.374044 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549138-sbrdb" event={"ID":"15c2d4c4-71ce-4639-9fa9-5147173cddfb","Type":"ContainerStarted","Data":"429e00fba7c737713e1649c06b86f985dcd81c81c65621a18de75d7dbbb06b54"} Mar 08 05:38:02 crc kubenswrapper[4717]: I0308 05:38:02.402153 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549138-sbrdb" podStartSLOduration=1.336074256 podStartE2EDuration="2.402125439s" podCreationTimestamp="2026-03-08 05:38:00 +0000 UTC" firstStartedPulling="2026-03-08 05:38:00.802824144 +0000 UTC m=+707.720473028" lastFinishedPulling="2026-03-08 05:38:01.868875327 +0000 UTC m=+708.786524211" observedRunningTime="2026-03-08 05:38:02.397230351 +0000 UTC m=+709.314879205" watchObservedRunningTime="2026-03-08 05:38:02.402125439 +0000 UTC m=+709.319774323" Mar 08 05:38:03 crc kubenswrapper[4717]: I0308 05:38:03.385529 4717 generic.go:334] "Generic (PLEG): container finished" podID="15c2d4c4-71ce-4639-9fa9-5147173cddfb" containerID="429e00fba7c737713e1649c06b86f985dcd81c81c65621a18de75d7dbbb06b54" exitCode=0 Mar 08 05:38:03 crc 
kubenswrapper[4717]: I0308 05:38:03.385640 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549138-sbrdb" event={"ID":"15c2d4c4-71ce-4639-9fa9-5147173cddfb","Type":"ContainerDied","Data":"429e00fba7c737713e1649c06b86f985dcd81c81c65621a18de75d7dbbb06b54"} Mar 08 05:38:04 crc kubenswrapper[4717]: I0308 05:38:04.710104 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549138-sbrdb" Mar 08 05:38:04 crc kubenswrapper[4717]: I0308 05:38:04.802415 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qttxw\" (UniqueName: \"kubernetes.io/projected/15c2d4c4-71ce-4639-9fa9-5147173cddfb-kube-api-access-qttxw\") pod \"15c2d4c4-71ce-4639-9fa9-5147173cddfb\" (UID: \"15c2d4c4-71ce-4639-9fa9-5147173cddfb\") " Mar 08 05:38:04 crc kubenswrapper[4717]: I0308 05:38:04.813116 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15c2d4c4-71ce-4639-9fa9-5147173cddfb-kube-api-access-qttxw" (OuterVolumeSpecName: "kube-api-access-qttxw") pod "15c2d4c4-71ce-4639-9fa9-5147173cddfb" (UID: "15c2d4c4-71ce-4639-9fa9-5147173cddfb"). InnerVolumeSpecName "kube-api-access-qttxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:38:04 crc kubenswrapper[4717]: I0308 05:38:04.906915 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qttxw\" (UniqueName: \"kubernetes.io/projected/15c2d4c4-71ce-4639-9fa9-5147173cddfb-kube-api-access-qttxw\") on node \"crc\" DevicePath \"\"" Mar 08 05:38:05 crc kubenswrapper[4717]: I0308 05:38:05.406599 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549138-sbrdb" event={"ID":"15c2d4c4-71ce-4639-9fa9-5147173cddfb","Type":"ContainerDied","Data":"7219605ef25a9570e61e55c8e2406c3a587c565552b7455e6a2e4678e689abc5"} Mar 08 05:38:05 crc kubenswrapper[4717]: I0308 05:38:05.407149 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7219605ef25a9570e61e55c8e2406c3a587c565552b7455e6a2e4678e689abc5" Mar 08 05:38:05 crc kubenswrapper[4717]: I0308 05:38:05.406733 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549138-sbrdb" Mar 08 05:38:05 crc kubenswrapper[4717]: I0308 05:38:05.488977 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549132-ffgrj"] Mar 08 05:38:05 crc kubenswrapper[4717]: I0308 05:38:05.495947 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549132-ffgrj"] Mar 08 05:38:05 crc kubenswrapper[4717]: I0308 05:38:05.795442 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b6ad0f-a029-40ca-9e23-a1794b1fb3a8" path="/var/lib/kubelet/pods/14b6ad0f-a029-40ca-9e23-a1794b1fb3a8/volumes" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.158763 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-q9jgr"] Mar 08 05:38:49 crc kubenswrapper[4717]: E0308 05:38:49.160400 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="15c2d4c4-71ce-4639-9fa9-5147173cddfb" containerName="oc" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.160430 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="15c2d4c4-71ce-4639-9fa9-5147173cddfb" containerName="oc" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.160670 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="15c2d4c4-71ce-4639-9fa9-5147173cddfb" containerName="oc" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.161510 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-q9jgr" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.163930 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2"] Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.164612 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hr7qg" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.164715 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.165052 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.165116 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.170147 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-fksjv" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.180997 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-q9jgr"] Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.205076 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2"] Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.216375 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-z8plj"] Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.219848 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-z8plj" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.222836 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-g96nr" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.241937 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-z8plj"] Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.270230 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwqgh\" (UniqueName: \"kubernetes.io/projected/1e9abf00-821a-412c-b6da-fa5c1f1a568a-kube-api-access-hwqgh\") pod \"cert-manager-cainjector-cf98fcc89-w5ms2\" (UID: \"1e9abf00-821a-412c-b6da-fa5c1f1a568a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.270410 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqxdb\" (UniqueName: 
\"kubernetes.io/projected/8815e0c5-e3aa-4015-95c2-e2091a21ef2f-kube-api-access-pqxdb\") pod \"cert-manager-webhook-687f57d79b-z8plj\" (UID: \"8815e0c5-e3aa-4015-95c2-e2091a21ef2f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-z8plj" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.270667 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb7dg\" (UniqueName: \"kubernetes.io/projected/af96d97e-e051-406c-b0f5-c9d59fb60bfa-kube-api-access-xb7dg\") pod \"cert-manager-858654f9db-q9jgr\" (UID: \"af96d97e-e051-406c-b0f5-c9d59fb60bfa\") " pod="cert-manager/cert-manager-858654f9db-q9jgr" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.372408 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb7dg\" (UniqueName: \"kubernetes.io/projected/af96d97e-e051-406c-b0f5-c9d59fb60bfa-kube-api-access-xb7dg\") pod \"cert-manager-858654f9db-q9jgr\" (UID: \"af96d97e-e051-406c-b0f5-c9d59fb60bfa\") " pod="cert-manager/cert-manager-858654f9db-q9jgr" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.372604 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwqgh\" (UniqueName: \"kubernetes.io/projected/1e9abf00-821a-412c-b6da-fa5c1f1a568a-kube-api-access-hwqgh\") pod \"cert-manager-cainjector-cf98fcc89-w5ms2\" (UID: \"1e9abf00-821a-412c-b6da-fa5c1f1a568a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.372739 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqxdb\" (UniqueName: \"kubernetes.io/projected/8815e0c5-e3aa-4015-95c2-e2091a21ef2f-kube-api-access-pqxdb\") pod \"cert-manager-webhook-687f57d79b-z8plj\" (UID: \"8815e0c5-e3aa-4015-95c2-e2091a21ef2f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-z8plj" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.397398 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqxdb\" (UniqueName: \"kubernetes.io/projected/8815e0c5-e3aa-4015-95c2-e2091a21ef2f-kube-api-access-pqxdb\") pod \"cert-manager-webhook-687f57d79b-z8plj\" (UID: \"8815e0c5-e3aa-4015-95c2-e2091a21ef2f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-z8plj" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.400339 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwqgh\" (UniqueName: \"kubernetes.io/projected/1e9abf00-821a-412c-b6da-fa5c1f1a568a-kube-api-access-hwqgh\") pod \"cert-manager-cainjector-cf98fcc89-w5ms2\" (UID: \"1e9abf00-821a-412c-b6da-fa5c1f1a568a\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.408107 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb7dg\" (UniqueName: \"kubernetes.io/projected/af96d97e-e051-406c-b0f5-c9d59fb60bfa-kube-api-access-xb7dg\") pod \"cert-manager-858654f9db-q9jgr\" (UID: \"af96d97e-e051-406c-b0f5-c9d59fb60bfa\") " pod="cert-manager/cert-manager-858654f9db-q9jgr" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.529781 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-q9jgr" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.539194 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.548149 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-z8plj" Mar 08 05:38:49 crc kubenswrapper[4717]: I0308 05:38:49.925822 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-z8plj"] Mar 08 05:38:50 crc kubenswrapper[4717]: W0308 05:38:50.084141 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf96d97e_e051_406c_b0f5_c9d59fb60bfa.slice/crio-323cba2693518af11a9551e8e4d597520b6868156cf93969b8e96bdb94ff0a5f WatchSource:0}: Error finding container 323cba2693518af11a9551e8e4d597520b6868156cf93969b8e96bdb94ff0a5f: Status 404 returned error can't find the container with id 323cba2693518af11a9551e8e4d597520b6868156cf93969b8e96bdb94ff0a5f Mar 08 05:38:50 crc kubenswrapper[4717]: I0308 05:38:50.086081 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-q9jgr"] Mar 08 05:38:50 crc kubenswrapper[4717]: W0308 05:38:50.090658 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e9abf00_821a_412c_b6da_fa5c1f1a568a.slice/crio-492244c1bb3fa3172a9523bd96ca44a31d238a43e8870b26ac99923939f370e2 WatchSource:0}: Error finding container 492244c1bb3fa3172a9523bd96ca44a31d238a43e8870b26ac99923939f370e2: Status 404 returned error can't find the container with id 492244c1bb3fa3172a9523bd96ca44a31d238a43e8870b26ac99923939f370e2 Mar 08 05:38:50 crc kubenswrapper[4717]: I0308 05:38:50.093286 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2"] Mar 08 05:38:50 crc kubenswrapper[4717]: I0308 05:38:50.765582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-q9jgr" 
event={"ID":"af96d97e-e051-406c-b0f5-c9d59fb60bfa","Type":"ContainerStarted","Data":"323cba2693518af11a9551e8e4d597520b6868156cf93969b8e96bdb94ff0a5f"} Mar 08 05:38:50 crc kubenswrapper[4717]: I0308 05:38:50.768616 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2" event={"ID":"1e9abf00-821a-412c-b6da-fa5c1f1a568a","Type":"ContainerStarted","Data":"492244c1bb3fa3172a9523bd96ca44a31d238a43e8870b26ac99923939f370e2"} Mar 08 05:38:50 crc kubenswrapper[4717]: I0308 05:38:50.771210 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-z8plj" event={"ID":"8815e0c5-e3aa-4015-95c2-e2091a21ef2f","Type":"ContainerStarted","Data":"26460cd7d4c2531ca43482da7762035b08c7c9122bf34b43442bb56cf37a575d"} Mar 08 05:38:52 crc kubenswrapper[4717]: I0308 05:38:52.796315 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-z8plj" event={"ID":"8815e0c5-e3aa-4015-95c2-e2091a21ef2f","Type":"ContainerStarted","Data":"d6350aa5739e7fa6ee456ab1950af4ab9f9898a8c9c2b5f22945c2a11bdede27"} Mar 08 05:38:52 crc kubenswrapper[4717]: I0308 05:38:52.797330 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-z8plj" Mar 08 05:38:52 crc kubenswrapper[4717]: I0308 05:38:52.823738 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-z8plj" podStartSLOduration=1.405296418 podStartE2EDuration="3.823718647s" podCreationTimestamp="2026-03-08 05:38:49 +0000 UTC" firstStartedPulling="2026-03-08 05:38:49.938835699 +0000 UTC m=+756.856484543" lastFinishedPulling="2026-03-08 05:38:52.357257888 +0000 UTC m=+759.274906772" observedRunningTime="2026-03-08 05:38:52.819651988 +0000 UTC m=+759.737300852" watchObservedRunningTime="2026-03-08 05:38:52.823718647 +0000 UTC m=+759.741367491" Mar 08 05:38:54 crc kubenswrapper[4717]: 
I0308 05:38:54.825928 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-q9jgr" event={"ID":"af96d97e-e051-406c-b0f5-c9d59fb60bfa","Type":"ContainerStarted","Data":"f01cb29fd15fb65ac3544cd6a216e255c875ca438d71fde0fb3f00e34d76238b"} Mar 08 05:38:54 crc kubenswrapper[4717]: I0308 05:38:54.828508 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2" event={"ID":"1e9abf00-821a-412c-b6da-fa5c1f1a568a","Type":"ContainerStarted","Data":"bb320500380c18f58e7cea134b0522aa362a9ee143e618d7af37139119c23a1a"} Mar 08 05:38:54 crc kubenswrapper[4717]: I0308 05:38:54.855982 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-q9jgr" podStartSLOduration=1.894722038 podStartE2EDuration="5.855932353s" podCreationTimestamp="2026-03-08 05:38:49 +0000 UTC" firstStartedPulling="2026-03-08 05:38:50.087589448 +0000 UTC m=+757.005238332" lastFinishedPulling="2026-03-08 05:38:54.048799763 +0000 UTC m=+760.966448647" observedRunningTime="2026-03-08 05:38:54.848035731 +0000 UTC m=+761.765684615" watchObservedRunningTime="2026-03-08 05:38:54.855932353 +0000 UTC m=+761.773581227" Mar 08 05:38:54 crc kubenswrapper[4717]: I0308 05:38:54.876264 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-w5ms2" podStartSLOduration=1.93290414 podStartE2EDuration="5.876224568s" podCreationTimestamp="2026-03-08 05:38:49 +0000 UTC" firstStartedPulling="2026-03-08 05:38:50.094156369 +0000 UTC m=+757.011805243" lastFinishedPulling="2026-03-08 05:38:54.037476797 +0000 UTC m=+760.955125671" observedRunningTime="2026-03-08 05:38:54.866936542 +0000 UTC m=+761.784585426" watchObservedRunningTime="2026-03-08 05:38:54.876224568 +0000 UTC m=+761.793873462" Mar 08 05:38:59 crc kubenswrapper[4717]: I0308 05:38:59.556155 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-z8plj" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.177617 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fb27m"] Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.178987 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovn-controller" containerID="cri-o://9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e" gracePeriod=30 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.179833 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="sbdb" containerID="cri-o://ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246" gracePeriod=30 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.179880 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="nbdb" containerID="cri-o://17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347" gracePeriod=30 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.179917 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="northd" containerID="cri-o://e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516" gracePeriod=30 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.179951 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902" gracePeriod=30 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.179996 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="kube-rbac-proxy-node" containerID="cri-o://18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6" gracePeriod=30 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.180047 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovn-acl-logging" containerID="cri-o://bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2" gracePeriod=30 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.220835 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" containerID="cri-o://0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed" gracePeriod=30 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.532667 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/3.log" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.535823 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovn-acl-logging/0.log" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.536488 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovn-controller/0.log" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.537061 4717 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605042 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l6dgz"] Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605328 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605342 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605350 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovn-acl-logging" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605356 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovn-acl-logging" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605364 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605370 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605378 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="sbdb" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605384 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="sbdb" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605394 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="northd" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605400 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="northd" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605408 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605415 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605426 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="kubecfg-setup" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605432 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="kubecfg-setup" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605441 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="kube-rbac-proxy-node" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605446 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="kube-rbac-proxy-node" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605456 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="nbdb" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605461 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="nbdb" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605468 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605476 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605489 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovn-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605496 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovn-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605617 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovn-acl-logging" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605629 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605636 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="sbdb" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605644 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605651 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605660 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605670 4717 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605701 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="northd" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605708 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="kube-rbac-proxy-node" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605718 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="nbdb" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605728 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovn-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.605820 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605828 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.605934 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.606242 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.606255 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerName="ovnkube-controller" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.608007 4717 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683199 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6btr2\" (UniqueName: \"kubernetes.io/projected/b862036c-9fe5-43c3-87a4-9ff24595c456-kube-api-access-6btr2\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683286 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-openvswitch\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683411 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683819 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683869 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-var-lib-openvswitch\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683888 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-etc-openvswitch\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683923 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-kubelet\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683939 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-node-log\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683958 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-slash\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683979 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-log-socket\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683998 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-bin\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.683989 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684063 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684021 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-config\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684171 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-systemd\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684064 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-slash" (OuterVolumeSpecName: "host-slash") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684207 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-ovn\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684024 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-node-log" (OuterVolumeSpecName: "node-log") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684072 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684103 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-log-socket" (OuterVolumeSpecName: "log-socket") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684242 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684244 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-systemd-units\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684365 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-script-lib\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684399 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-netd\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684276 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684428 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-ovn-kubernetes\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684466 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b862036c-9fe5-43c3-87a4-9ff24595c456-ovn-node-metrics-cert\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684491 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-netns\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684574 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-env-overrides\") pod \"b862036c-9fe5-43c3-87a4-9ff24595c456\" (UID: \"b862036c-9fe5-43c3-87a4-9ff24595c456\") " Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684670 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzqg2\" (UniqueName: \"kubernetes.io/projected/0157d451-a9e1-4ad4-b526-a1107b933489-kube-api-access-bzqg2\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684724 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0157d451-a9e1-4ad4-b526-a1107b933489-ovnkube-script-lib\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684763 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-log-socket\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684783 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-run-systemd\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684804 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-systemd-units\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0157d451-a9e1-4ad4-b526-a1107b933489-ovnkube-config\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 
05:39:10.684853 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-run-ovn\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684873 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0157d451-a9e1-4ad4-b526-a1107b933489-env-overrides\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684896 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-slash\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684944 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-var-lib-openvswitch\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684984 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-node-log\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 
05:39:10.685009 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685048 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-cni-bin\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685079 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685110 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0157d451-a9e1-4ad4-b526-a1107b933489-ovn-node-metrics-cert\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685147 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-run-netns\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685175 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-etc-openvswitch\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685200 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-cni-netd\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685229 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-kubelet\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685253 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-run-openvswitch\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685300 4717 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685315 4717 
reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685328 4717 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-node-log\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685342 4717 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-slash\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685354 4717 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-log-socket\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685366 4717 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685378 4717 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685392 4717 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685404 4717 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684116 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684160 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684397 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.684719 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685457 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685784 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685914 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.685936 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.691436 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b862036c-9fe5-43c3-87a4-9ff24595c456-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.693173 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b862036c-9fe5-43c3-87a4-9ff24595c456-kube-api-access-6btr2" (OuterVolumeSpecName: "kube-api-access-6btr2") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "kube-api-access-6btr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.707515 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b862036c-9fe5-43c3-87a4-9ff24595c456" (UID: "b862036c-9fe5-43c3-87a4-9ff24595c456"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787218 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-run-systemd\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787281 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-systemd-units\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787310 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0157d451-a9e1-4ad4-b526-a1107b933489-ovnkube-config\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787329 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-run-ovn\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0157d451-a9e1-4ad4-b526-a1107b933489-env-overrides\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 
05:39:10.787383 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-slash\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787408 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-var-lib-openvswitch\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787442 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-node-log\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787463 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787510 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787540 
4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-cni-bin\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787562 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0157d451-a9e1-4ad4-b526-a1107b933489-ovn-node-metrics-cert\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787597 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-run-netns\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787626 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-cni-netd\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-etc-openvswitch\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787675 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-kubelet\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787706 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-run-openvswitch\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787735 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzqg2\" (UniqueName: \"kubernetes.io/projected/0157d451-a9e1-4ad4-b526-a1107b933489-kube-api-access-bzqg2\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787759 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0157d451-a9e1-4ad4-b526-a1107b933489-ovnkube-script-lib\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787792 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-log-socket\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787849 4717 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787864 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787878 4717 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787889 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787901 4717 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787917 4717 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787933 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b862036c-9fe5-43c3-87a4-9ff24595c456-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787947 4717 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-host-run-netns\") on node \"crc\" 
DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787959 4717 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b862036c-9fe5-43c3-87a4-9ff24595c456-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787970 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6btr2\" (UniqueName: \"kubernetes.io/projected/b862036c-9fe5-43c3-87a4-9ff24595c456-kube-api-access-6btr2\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.787982 4717 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b862036c-9fe5-43c3-87a4-9ff24595c456-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.788033 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-log-socket\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.788081 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-run-systemd\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.788109 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-systemd-units\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: 
I0308 05:39:10.788877 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0157d451-a9e1-4ad4-b526-a1107b933489-ovnkube-config\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.788929 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-run-ovn\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.789279 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0157d451-a9e1-4ad4-b526-a1107b933489-env-overrides\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.789324 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-slash\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.789357 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-var-lib-openvswitch\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.789390 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-node-log\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.789425 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.790609 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.790643 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-cni-bin\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.790825 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-kubelet\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.790933 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-etc-openvswitch\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.790989 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-run-netns\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.791136 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-run-openvswitch\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.791126 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0157d451-a9e1-4ad4-b526-a1107b933489-host-cni-netd\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.791551 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0157d451-a9e1-4ad4-b526-a1107b933489-ovnkube-script-lib\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.807028 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0157d451-a9e1-4ad4-b526-a1107b933489-ovn-node-metrics-cert\") pod 
\"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.818552 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzqg2\" (UniqueName: \"kubernetes.io/projected/0157d451-a9e1-4ad4-b526-a1107b933489-kube-api-access-bzqg2\") pod \"ovnkube-node-l6dgz\" (UID: \"0157d451-a9e1-4ad4-b526-a1107b933489\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.926006 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:10 crc kubenswrapper[4717]: W0308 05:39:10.958188 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0157d451_a9e1_4ad4_b526_a1107b933489.slice/crio-c74987f13308cd7a0161afe07561c8a7b8f9fcc0bb55faebcad0c868e6ee22d2 WatchSource:0}: Error finding container c74987f13308cd7a0161afe07561c8a7b8f9fcc0bb55faebcad0c868e6ee22d2: Status 404 returned error can't find the container with id c74987f13308cd7a0161afe07561c8a7b8f9fcc0bb55faebcad0c868e6ee22d2 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.967808 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6f7j_95c5996b-1216-4f9c-bc1f-0ca06f8de088/kube-multus/2.log" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.968293 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6f7j_95c5996b-1216-4f9c-bc1f-0ca06f8de088/kube-multus/1.log" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.968354 4717 generic.go:334] "Generic (PLEG): container finished" podID="95c5996b-1216-4f9c-bc1f-0ca06f8de088" containerID="3a18e71ac14cc4af9ad8953aae2e1a8d6cfc3b1666d8ba874932aa48de8222cb" exitCode=2 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.968471 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6f7j" event={"ID":"95c5996b-1216-4f9c-bc1f-0ca06f8de088","Type":"ContainerDied","Data":"3a18e71ac14cc4af9ad8953aae2e1a8d6cfc3b1666d8ba874932aa48de8222cb"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.968550 4717 scope.go:117] "RemoveContainer" containerID="251115a6de2a8ada53391b6baf31955223218643cfbe202e455242ebaa67c7c6" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.969037 4717 scope.go:117] "RemoveContainer" containerID="3a18e71ac14cc4af9ad8953aae2e1a8d6cfc3b1666d8ba874932aa48de8222cb" Mar 08 05:39:10 crc kubenswrapper[4717]: E0308 05:39:10.969241 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d6f7j_openshift-multus(95c5996b-1216-4f9c-bc1f-0ca06f8de088)\"" pod="openshift-multus/multus-d6f7j" podUID="95c5996b-1216-4f9c-bc1f-0ca06f8de088" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.970508 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovnkube-controller/3.log" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.973512 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovn-acl-logging/0.log" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.974225 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fb27m_b862036c-9fe5-43c3-87a4-9ff24595c456/ovn-controller/0.log" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.974896 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed" exitCode=0 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 
05:39:10.974947 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246" exitCode=0 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.974957 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347" exitCode=0 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.974967 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516" exitCode=0 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.974977 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902" exitCode=0 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975005 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6" exitCode=0 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975013 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2" exitCode=143 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975021 4717 generic.go:334] "Generic (PLEG): container finished" podID="b862036c-9fe5-43c3-87a4-9ff24595c456" containerID="9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e" exitCode=143 Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975025 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.974977 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975172 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975187 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975199 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975211 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975225 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6"} Mar 08 05:39:10 crc 
kubenswrapper[4717]: I0308 05:39:10.975238 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975250 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975256 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975262 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975267 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975274 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975280 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975285 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2"} Mar 08 05:39:10 crc 
kubenswrapper[4717]: I0308 05:39:10.975291 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975296 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975303 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975312 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975318 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975324 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975331 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975338 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975344 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975351 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975357 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975363 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975369 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975376 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975386 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975393 4717 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975398 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975403 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975408 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975414 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975419 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975424 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975429 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975434 4717 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975440 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb27m" event={"ID":"b862036c-9fe5-43c3-87a4-9ff24595c456","Type":"ContainerDied","Data":"76add08484a3a8d64393522f61f226f44d1c19c40da5e2a47a62ab9136a28699"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975448 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975459 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975465 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975471 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975476 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975482 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902"} Mar 08 
05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975489 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975494 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975500 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e"} Mar 08 05:39:10 crc kubenswrapper[4717]: I0308 05:39:10.975505 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b"} Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.022550 4717 scope.go:117] "RemoveContainer" containerID="0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.055220 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fb27m"] Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.060528 4717 scope.go:117] "RemoveContainer" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.060647 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fb27m"] Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.085966 4717 scope.go:117] "RemoveContainer" containerID="ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.109818 4717 scope.go:117] "RemoveContainer" 
containerID="17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.186623 4717 scope.go:117] "RemoveContainer" containerID="e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.203128 4717 scope.go:117] "RemoveContainer" containerID="1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.217989 4717 scope.go:117] "RemoveContainer" containerID="18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.232429 4717 scope.go:117] "RemoveContainer" containerID="bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.245741 4717 scope.go:117] "RemoveContainer" containerID="9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.265166 4717 scope.go:117] "RemoveContainer" containerID="705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.283887 4717 scope.go:117] "RemoveContainer" containerID="0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed" Mar 08 05:39:11 crc kubenswrapper[4717]: E0308 05:39:11.284583 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed\": container with ID starting with 0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed not found: ID does not exist" containerID="0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.284635 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed"} err="failed to get container status \"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed\": rpc error: code = NotFound desc = could not find container \"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed\": container with ID starting with 0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.284670 4717 scope.go:117] "RemoveContainer" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:39:11 crc kubenswrapper[4717]: E0308 05:39:11.285188 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\": container with ID starting with 62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808 not found: ID does not exist" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.285242 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808"} err="failed to get container status \"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\": rpc error: code = NotFound desc = could not find container \"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\": container with ID starting with 62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.285279 4717 scope.go:117] "RemoveContainer" containerID="ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246" Mar 08 05:39:11 crc kubenswrapper[4717]: E0308 05:39:11.285705 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\": container with ID starting with ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246 not found: ID does not exist" containerID="ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.285753 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246"} err="failed to get container status \"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\": rpc error: code = NotFound desc = could not find container \"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\": container with ID starting with ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.285786 4717 scope.go:117] "RemoveContainer" containerID="17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347" Mar 08 05:39:11 crc kubenswrapper[4717]: E0308 05:39:11.286113 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\": container with ID starting with 17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347 not found: ID does not exist" containerID="17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.286145 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347"} err="failed to get container status \"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\": rpc error: code = NotFound desc = could not find container 
\"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\": container with ID starting with 17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.286162 4717 scope.go:117] "RemoveContainer" containerID="e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516" Mar 08 05:39:11 crc kubenswrapper[4717]: E0308 05:39:11.286589 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\": container with ID starting with e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516 not found: ID does not exist" containerID="e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.286612 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516"} err="failed to get container status \"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\": rpc error: code = NotFound desc = could not find container \"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\": container with ID starting with e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.286625 4717 scope.go:117] "RemoveContainer" containerID="1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902" Mar 08 05:39:11 crc kubenswrapper[4717]: E0308 05:39:11.286902 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\": container with ID starting with 1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902 not found: ID does not exist" 
containerID="1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.286925 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902"} err="failed to get container status \"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\": rpc error: code = NotFound desc = could not find container \"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\": container with ID starting with 1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.286940 4717 scope.go:117] "RemoveContainer" containerID="18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6" Mar 08 05:39:11 crc kubenswrapper[4717]: E0308 05:39:11.287202 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\": container with ID starting with 18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6 not found: ID does not exist" containerID="18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.287243 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6"} err="failed to get container status \"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\": rpc error: code = NotFound desc = could not find container \"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\": container with ID starting with 18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.287269 4717 scope.go:117] 
"RemoveContainer" containerID="bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2" Mar 08 05:39:11 crc kubenswrapper[4717]: E0308 05:39:11.287580 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\": container with ID starting with bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2 not found: ID does not exist" containerID="bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.287603 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2"} err="failed to get container status \"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\": rpc error: code = NotFound desc = could not find container \"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\": container with ID starting with bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.287614 4717 scope.go:117] "RemoveContainer" containerID="9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e" Mar 08 05:39:11 crc kubenswrapper[4717]: E0308 05:39:11.287946 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\": container with ID starting with 9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e not found: ID does not exist" containerID="9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.287998 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e"} err="failed to get container status \"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\": rpc error: code = NotFound desc = could not find container \"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\": container with ID starting with 9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.288028 4717 scope.go:117] "RemoveContainer" containerID="705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b" Mar 08 05:39:11 crc kubenswrapper[4717]: E0308 05:39:11.288501 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\": container with ID starting with 705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b not found: ID does not exist" containerID="705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.288548 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b"} err="failed to get container status \"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\": rpc error: code = NotFound desc = could not find container \"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\": container with ID starting with 705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.288578 4717 scope.go:117] "RemoveContainer" containerID="0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.288994 4717 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed"} err="failed to get container status \"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed\": rpc error: code = NotFound desc = could not find container \"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed\": container with ID starting with 0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.289035 4717 scope.go:117] "RemoveContainer" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.289412 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808"} err="failed to get container status \"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\": rpc error: code = NotFound desc = could not find container \"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\": container with ID starting with 62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.289447 4717 scope.go:117] "RemoveContainer" containerID="ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.289845 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246"} err="failed to get container status \"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\": rpc error: code = NotFound desc = could not find container \"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\": container with ID starting with ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246 not 
found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.289890 4717 scope.go:117] "RemoveContainer" containerID="17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.290186 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347"} err="failed to get container status \"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\": rpc error: code = NotFound desc = could not find container \"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\": container with ID starting with 17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.290218 4717 scope.go:117] "RemoveContainer" containerID="e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.290572 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516"} err="failed to get container status \"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\": rpc error: code = NotFound desc = could not find container \"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\": container with ID starting with e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.290595 4717 scope.go:117] "RemoveContainer" containerID="1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.291010 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902"} err="failed to get 
container status \"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\": rpc error: code = NotFound desc = could not find container \"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\": container with ID starting with 1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.291040 4717 scope.go:117] "RemoveContainer" containerID="18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.291348 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6"} err="failed to get container status \"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\": rpc error: code = NotFound desc = could not find container \"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\": container with ID starting with 18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.291391 4717 scope.go:117] "RemoveContainer" containerID="bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.291706 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2"} err="failed to get container status \"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\": rpc error: code = NotFound desc = could not find container \"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\": container with ID starting with bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.291725 4717 scope.go:117] "RemoveContainer" 
containerID="9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.291980 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e"} err="failed to get container status \"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\": rpc error: code = NotFound desc = could not find container \"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\": container with ID starting with 9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.292020 4717 scope.go:117] "RemoveContainer" containerID="705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.292426 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b"} err="failed to get container status \"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\": rpc error: code = NotFound desc = could not find container \"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\": container with ID starting with 705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.292453 4717 scope.go:117] "RemoveContainer" containerID="0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.292793 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed"} err="failed to get container status \"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed\": rpc error: code = NotFound desc = could 
not find container \"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed\": container with ID starting with 0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.292832 4717 scope.go:117] "RemoveContainer" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.293137 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808"} err="failed to get container status \"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\": rpc error: code = NotFound desc = could not find container \"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\": container with ID starting with 62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.293159 4717 scope.go:117] "RemoveContainer" containerID="ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.293464 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246"} err="failed to get container status \"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\": rpc error: code = NotFound desc = could not find container \"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\": container with ID starting with ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.293484 4717 scope.go:117] "RemoveContainer" containerID="17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 
05:39:11.293774 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347"} err="failed to get container status \"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\": rpc error: code = NotFound desc = could not find container \"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\": container with ID starting with 17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.293816 4717 scope.go:117] "RemoveContainer" containerID="e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.294163 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516"} err="failed to get container status \"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\": rpc error: code = NotFound desc = could not find container \"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\": container with ID starting with e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.294211 4717 scope.go:117] "RemoveContainer" containerID="1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.294551 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902"} err="failed to get container status \"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\": rpc error: code = NotFound desc = could not find container \"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\": container with ID starting with 
1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.294580 4717 scope.go:117] "RemoveContainer" containerID="18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.294994 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6"} err="failed to get container status \"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\": rpc error: code = NotFound desc = could not find container \"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\": container with ID starting with 18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.295035 4717 scope.go:117] "RemoveContainer" containerID="bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.295373 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2"} err="failed to get container status \"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\": rpc error: code = NotFound desc = could not find container \"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\": container with ID starting with bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.295398 4717 scope.go:117] "RemoveContainer" containerID="9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.296049 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e"} err="failed to get container status \"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\": rpc error: code = NotFound desc = could not find container \"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\": container with ID starting with 9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.296068 4717 scope.go:117] "RemoveContainer" containerID="705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.296404 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b"} err="failed to get container status \"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\": rpc error: code = NotFound desc = could not find container \"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\": container with ID starting with 705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.296446 4717 scope.go:117] "RemoveContainer" containerID="0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.296895 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed"} err="failed to get container status \"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed\": rpc error: code = NotFound desc = could not find container \"0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed\": container with ID starting with 0cfacd3a677d4a69c72819c4cdca2883e4f15143e0305071565129f4b26298ed not found: ID does not 
exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.296915 4717 scope.go:117] "RemoveContainer" containerID="62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.297258 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808"} err="failed to get container status \"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\": rpc error: code = NotFound desc = could not find container \"62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808\": container with ID starting with 62afb97e18ec3478f3ef57979fbe49321615ae6e62b8ac2197bdc7603cf03808 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.297295 4717 scope.go:117] "RemoveContainer" containerID="ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.297650 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246"} err="failed to get container status \"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\": rpc error: code = NotFound desc = could not find container \"ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246\": container with ID starting with ecefad81e8839ea460e57d4bcc6569e402e97a0f593f4837d0ca2e2ce1d2f246 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.297719 4717 scope.go:117] "RemoveContainer" containerID="17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.298056 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347"} err="failed to get container status 
\"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\": rpc error: code = NotFound desc = could not find container \"17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347\": container with ID starting with 17cd59eeaa991f199d8c89f47513aa29afbc416bf64c298e21619e25bf4d4347 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.298077 4717 scope.go:117] "RemoveContainer" containerID="e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.298509 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516"} err="failed to get container status \"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\": rpc error: code = NotFound desc = could not find container \"e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516\": container with ID starting with e12ca52a865a5c3f6b7bf39d387d6a398e1cb3ecfcd7ba0b553b244747065516 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.298533 4717 scope.go:117] "RemoveContainer" containerID="1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.298875 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902"} err="failed to get container status \"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\": rpc error: code = NotFound desc = could not find container \"1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902\": container with ID starting with 1795e6ce2ce3baeabc2e2a418bee8c5e97912219f072feb17e57bbf826cd9902 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.298920 4717 scope.go:117] "RemoveContainer" 
containerID="18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.299282 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6"} err="failed to get container status \"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\": rpc error: code = NotFound desc = could not find container \"18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6\": container with ID starting with 18042a7a8eced925a59b6d7be9cdd05872f04ba1a8bef595f84d50088108bec6 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.299312 4717 scope.go:117] "RemoveContainer" containerID="bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.299706 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2"} err="failed to get container status \"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\": rpc error: code = NotFound desc = could not find container \"bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2\": container with ID starting with bb2bbf77df2c8e96fa66ff6bd3b679a45cdb37645b2381f4e2e737845e3c62b2 not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.299745 4717 scope.go:117] "RemoveContainer" containerID="9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.300097 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e"} err="failed to get container status \"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\": rpc error: code = NotFound desc = could 
not find container \"9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e\": container with ID starting with 9332feca4082eeed9ba05e428cdd1321e11659d8f5a17423a7562e523af8664e not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.300126 4717 scope.go:117] "RemoveContainer" containerID="705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.300446 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b"} err="failed to get container status \"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\": rpc error: code = NotFound desc = could not find container \"705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b\": container with ID starting with 705f3fdbb9c4dce7b03fb1e033d16aed16e0893e69b39c45a1b4fb51085fb02b not found: ID does not exist" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.791084 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b862036c-9fe5-43c3-87a4-9ff24595c456" path="/var/lib/kubelet/pods/b862036c-9fe5-43c3-87a4-9ff24595c456/volumes" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.987754 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6f7j_95c5996b-1216-4f9c-bc1f-0ca06f8de088/kube-multus/2.log" Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.991097 4717 generic.go:334] "Generic (PLEG): container finished" podID="0157d451-a9e1-4ad4-b526-a1107b933489" containerID="85b496a9fb0ea32212db6dcc826058a296e0d8b04258b5b825481ffd77fa7682" exitCode=0 Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.991199 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" 
event={"ID":"0157d451-a9e1-4ad4-b526-a1107b933489","Type":"ContainerDied","Data":"85b496a9fb0ea32212db6dcc826058a296e0d8b04258b5b825481ffd77fa7682"} Mar 08 05:39:11 crc kubenswrapper[4717]: I0308 05:39:11.991298 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" event={"ID":"0157d451-a9e1-4ad4-b526-a1107b933489","Type":"ContainerStarted","Data":"c74987f13308cd7a0161afe07561c8a7b8f9fcc0bb55faebcad0c868e6ee22d2"} Mar 08 05:39:13 crc kubenswrapper[4717]: I0308 05:39:13.011913 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" event={"ID":"0157d451-a9e1-4ad4-b526-a1107b933489","Type":"ContainerStarted","Data":"dc3dca6dca3c0ecaf7f2516679e477cde8fbd18c1dcc33a71bc38ed509ef22b8"} Mar 08 05:39:13 crc kubenswrapper[4717]: I0308 05:39:13.012366 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" event={"ID":"0157d451-a9e1-4ad4-b526-a1107b933489","Type":"ContainerStarted","Data":"a5463f6d4eaeeb40104fe5e70c9325736845050b3f3699f3c8e9a7d4679040b9"} Mar 08 05:39:13 crc kubenswrapper[4717]: I0308 05:39:13.012392 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" event={"ID":"0157d451-a9e1-4ad4-b526-a1107b933489","Type":"ContainerStarted","Data":"7025ccbb1e320a437ba2c4258c2e1de058d694f902285ec9bec6a66c65c1882f"} Mar 08 05:39:13 crc kubenswrapper[4717]: I0308 05:39:13.012417 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" event={"ID":"0157d451-a9e1-4ad4-b526-a1107b933489","Type":"ContainerStarted","Data":"1fb3c7417896dc30e1484ceb92c148f51cfff3c6576f05c2b7ff6ad4b6598441"} Mar 08 05:39:13 crc kubenswrapper[4717]: I0308 05:39:13.012436 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" 
event={"ID":"0157d451-a9e1-4ad4-b526-a1107b933489","Type":"ContainerStarted","Data":"3b98e8f0c8c41b5427912cfff00c54b1ff02bf0cca5e82e1bb28c00e887af7bf"} Mar 08 05:39:14 crc kubenswrapper[4717]: I0308 05:39:14.025933 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" event={"ID":"0157d451-a9e1-4ad4-b526-a1107b933489","Type":"ContainerStarted","Data":"ee1e8367752bbab6cc3e9256b78f45d91798b5e6b3a17f7a446e9f142bb8812a"} Mar 08 05:39:16 crc kubenswrapper[4717]: I0308 05:39:16.046066 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" event={"ID":"0157d451-a9e1-4ad4-b526-a1107b933489","Type":"ContainerStarted","Data":"1a45cef7a924b80b3c4dd706610b0b0b2c810ad8c67740290c5a94a3200240b7"} Mar 08 05:39:19 crc kubenswrapper[4717]: I0308 05:39:19.075548 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" event={"ID":"0157d451-a9e1-4ad4-b526-a1107b933489","Type":"ContainerStarted","Data":"5dd41962763a6e3a653d9bbcfbf81b6e7456a47b192bfba1dbfcdb2de86f9a64"} Mar 08 05:39:19 crc kubenswrapper[4717]: I0308 05:39:19.076558 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:19 crc kubenswrapper[4717]: I0308 05:39:19.076583 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:19 crc kubenswrapper[4717]: I0308 05:39:19.124760 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:19 crc kubenswrapper[4717]: I0308 05:39:19.137466 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" podStartSLOduration=9.137434232 podStartE2EDuration="9.137434232s" podCreationTimestamp="2026-03-08 05:39:10 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:39:19.12506837 +0000 UTC m=+786.042717244" watchObservedRunningTime="2026-03-08 05:39:19.137434232 +0000 UTC m=+786.055083106" Mar 08 05:39:20 crc kubenswrapper[4717]: I0308 05:39:20.082059 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:20 crc kubenswrapper[4717]: I0308 05:39:20.130963 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:24 crc kubenswrapper[4717]: I0308 05:39:24.017196 4717 scope.go:117] "RemoveContainer" containerID="d4253d896c970eacbaf4aa5f9f8157db8345f281a092d32e4503e112958f6b0f" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.718448 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c"] Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.720573 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.726368 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.736118 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c"] Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.804905 4717 scope.go:117] "RemoveContainer" containerID="3a18e71ac14cc4af9ad8953aae2e1a8d6cfc3b1666d8ba874932aa48de8222cb" Mar 08 05:39:25 crc kubenswrapper[4717]: E0308 05:39:25.805492 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d6f7j_openshift-multus(95c5996b-1216-4f9c-bc1f-0ca06f8de088)\"" pod="openshift-multus/multus-d6f7j" podUID="95c5996b-1216-4f9c-bc1f-0ca06f8de088" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.829833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmrnz\" (UniqueName: \"kubernetes.io/projected/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-kube-api-access-cmrnz\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.829910 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.829962 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.932182 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.932514 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmrnz\" (UniqueName: \"kubernetes.io/projected/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-kube-api-access-cmrnz\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.932576 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 
05:39:25.933373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.933459 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:25 crc kubenswrapper[4717]: I0308 05:39:25.962456 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmrnz\" (UniqueName: \"kubernetes.io/projected/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-kube-api-access-cmrnz\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:26 crc kubenswrapper[4717]: I0308 05:39:26.109022 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:26 crc kubenswrapper[4717]: E0308 05:39:26.154071 4717 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(64841b21a9ab9950ff77357bbd7f52af4a127ee93b828d61f56fe1b98d90b071): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:39:26 crc kubenswrapper[4717]: E0308 05:39:26.154200 4717 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(64841b21a9ab9950ff77357bbd7f52af4a127ee93b828d61f56fe1b98d90b071): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:26 crc kubenswrapper[4717]: E0308 05:39:26.154247 4717 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(64841b21a9ab9950ff77357bbd7f52af4a127ee93b828d61f56fe1b98d90b071): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:26 crc kubenswrapper[4717]: E0308 05:39:26.154340 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace(8ab2fa77-0e5e-4c32-86af-eacf41b1902e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace(8ab2fa77-0e5e-4c32-86af-eacf41b1902e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(64841b21a9ab9950ff77357bbd7f52af4a127ee93b828d61f56fe1b98d90b071): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" podUID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" Mar 08 05:39:27 crc kubenswrapper[4717]: I0308 05:39:27.132069 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:27 crc kubenswrapper[4717]: I0308 05:39:27.132933 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:27 crc kubenswrapper[4717]: E0308 05:39:27.177957 4717 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(4f1807e909c10215e22c5d93273f9128e2d72ba625762371117c4fe4e9ce8d4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 08 05:39:27 crc kubenswrapper[4717]: E0308 05:39:27.178077 4717 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(4f1807e909c10215e22c5d93273f9128e2d72ba625762371117c4fe4e9ce8d4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:27 crc kubenswrapper[4717]: E0308 05:39:27.178117 4717 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(4f1807e909c10215e22c5d93273f9128e2d72ba625762371117c4fe4e9ce8d4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:27 crc kubenswrapper[4717]: E0308 05:39:27.178206 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace(8ab2fa77-0e5e-4c32-86af-eacf41b1902e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace(8ab2fa77-0e5e-4c32-86af-eacf41b1902e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(4f1807e909c10215e22c5d93273f9128e2d72ba625762371117c4fe4e9ce8d4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" podUID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" Mar 08 05:39:34 crc kubenswrapper[4717]: I0308 05:39:34.120776 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:39:34 crc kubenswrapper[4717]: I0308 05:39:34.121397 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:39:39 crc kubenswrapper[4717]: I0308 05:39:39.781302 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:39 crc kubenswrapper[4717]: I0308 05:39:39.781907 4717 scope.go:117] "RemoveContainer" containerID="3a18e71ac14cc4af9ad8953aae2e1a8d6cfc3b1666d8ba874932aa48de8222cb" Mar 08 05:39:39 crc kubenswrapper[4717]: I0308 05:39:39.782789 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:39 crc kubenswrapper[4717]: E0308 05:39:39.828905 4717 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(8adb8ca0af7b1cf7e88d62c501767b7f4282b2986b72b9b1ba118d602ecefd11): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 05:39:39 crc kubenswrapper[4717]: E0308 05:39:39.829004 4717 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(8adb8ca0af7b1cf7e88d62c501767b7f4282b2986b72b9b1ba118d602ecefd11): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:39 crc kubenswrapper[4717]: E0308 05:39:39.829044 4717 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(8adb8ca0af7b1cf7e88d62c501767b7f4282b2986b72b9b1ba118d602ecefd11): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:39 crc kubenswrapper[4717]: E0308 05:39:39.829134 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace(8ab2fa77-0e5e-4c32-86af-eacf41b1902e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace(8ab2fa77-0e5e-4c32-86af-eacf41b1902e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_openshift-marketplace_8ab2fa77-0e5e-4c32-86af-eacf41b1902e_0(8adb8ca0af7b1cf7e88d62c501767b7f4282b2986b72b9b1ba118d602ecefd11): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" podUID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" Mar 08 05:39:40 crc kubenswrapper[4717]: I0308 05:39:40.244287 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6f7j_95c5996b-1216-4f9c-bc1f-0ca06f8de088/kube-multus/2.log" Mar 08 05:39:40 crc kubenswrapper[4717]: I0308 05:39:40.244387 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6f7j" event={"ID":"95c5996b-1216-4f9c-bc1f-0ca06f8de088","Type":"ContainerStarted","Data":"46a6a23182484e1d097e4bb53d7d5fabc9318eab5d2eff554ce7dc73df2b4b94"} Mar 08 05:39:40 crc kubenswrapper[4717]: I0308 05:39:40.968928 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6dgz" Mar 08 05:39:51 crc kubenswrapper[4717]: I0308 05:39:51.781373 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:51 crc kubenswrapper[4717]: I0308 05:39:51.783109 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:52 crc kubenswrapper[4717]: I0308 05:39:52.064555 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c"] Mar 08 05:39:52 crc kubenswrapper[4717]: I0308 05:39:52.344072 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" event={"ID":"8ab2fa77-0e5e-4c32-86af-eacf41b1902e","Type":"ContainerStarted","Data":"6ea9d59a8bb1cdf642c9756f644b0bdae8c3b92f9f2f129d627289326216b972"} Mar 08 05:39:52 crc kubenswrapper[4717]: I0308 05:39:52.344157 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" event={"ID":"8ab2fa77-0e5e-4c32-86af-eacf41b1902e","Type":"ContainerStarted","Data":"5248204b1b1d8af8bce0de01e908cc4865a0c7fadfd6537d8fa9e9c89662130e"} Mar 08 05:39:53 crc kubenswrapper[4717]: I0308 05:39:53.356811 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" event={"ID":"8ab2fa77-0e5e-4c32-86af-eacf41b1902e","Type":"ContainerDied","Data":"6ea9d59a8bb1cdf642c9756f644b0bdae8c3b92f9f2f129d627289326216b972"} Mar 08 05:39:53 crc kubenswrapper[4717]: I0308 05:39:53.358953 4717 generic.go:334] "Generic (PLEG): container finished" podID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" containerID="6ea9d59a8bb1cdf642c9756f644b0bdae8c3b92f9f2f129d627289326216b972" exitCode=0 Mar 08 05:39:55 crc kubenswrapper[4717]: I0308 05:39:55.378819 4717 generic.go:334] "Generic (PLEG): container finished" podID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" containerID="b72a8b761af31fe8d9e5f64d714e15dde7db18f458e42e9208db9f03758cb117" exitCode=0 Mar 08 05:39:55 crc kubenswrapper[4717]: I0308 05:39:55.378952 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" event={"ID":"8ab2fa77-0e5e-4c32-86af-eacf41b1902e","Type":"ContainerDied","Data":"b72a8b761af31fe8d9e5f64d714e15dde7db18f458e42e9208db9f03758cb117"} Mar 08 05:39:56 crc kubenswrapper[4717]: I0308 05:39:56.392451 4717 generic.go:334] "Generic (PLEG): container finished" podID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" containerID="68fac1c2d677a4a5573e4fb129e9f785db7b676b0234a0f0ceb23323f0c28585" exitCode=0 Mar 08 05:39:56 crc kubenswrapper[4717]: I0308 05:39:56.392541 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" event={"ID":"8ab2fa77-0e5e-4c32-86af-eacf41b1902e","Type":"ContainerDied","Data":"68fac1c2d677a4a5573e4fb129e9f785db7b676b0234a0f0ceb23323f0c28585"} Mar 08 05:39:57 crc kubenswrapper[4717]: I0308 05:39:57.741182 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:39:57 crc kubenswrapper[4717]: I0308 05:39:57.820111 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-bundle\") pod \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " Mar 08 05:39:57 crc kubenswrapper[4717]: I0308 05:39:57.820356 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-util\") pod \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " Mar 08 05:39:57 crc kubenswrapper[4717]: I0308 05:39:57.823067 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-bundle" (OuterVolumeSpecName: "bundle") pod "8ab2fa77-0e5e-4c32-86af-eacf41b1902e" (UID: "8ab2fa77-0e5e-4c32-86af-eacf41b1902e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:39:57 crc kubenswrapper[4717]: I0308 05:39:57.826980 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmrnz\" (UniqueName: \"kubernetes.io/projected/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-kube-api-access-cmrnz\") pod \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\" (UID: \"8ab2fa77-0e5e-4c32-86af-eacf41b1902e\") " Mar 08 05:39:57 crc kubenswrapper[4717]: I0308 05:39:57.827618 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:57 crc kubenswrapper[4717]: I0308 05:39:57.837661 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-kube-api-access-cmrnz" (OuterVolumeSpecName: "kube-api-access-cmrnz") pod "8ab2fa77-0e5e-4c32-86af-eacf41b1902e" (UID: "8ab2fa77-0e5e-4c32-86af-eacf41b1902e"). InnerVolumeSpecName "kube-api-access-cmrnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:39:57 crc kubenswrapper[4717]: I0308 05:39:57.929545 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmrnz\" (UniqueName: \"kubernetes.io/projected/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-kube-api-access-cmrnz\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:58 crc kubenswrapper[4717]: I0308 05:39:58.108963 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-util" (OuterVolumeSpecName: "util") pod "8ab2fa77-0e5e-4c32-86af-eacf41b1902e" (UID: "8ab2fa77-0e5e-4c32-86af-eacf41b1902e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:39:58 crc kubenswrapper[4717]: I0308 05:39:58.132535 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ab2fa77-0e5e-4c32-86af-eacf41b1902e-util\") on node \"crc\" DevicePath \"\"" Mar 08 05:39:58 crc kubenswrapper[4717]: I0308 05:39:58.414514 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" event={"ID":"8ab2fa77-0e5e-4c32-86af-eacf41b1902e","Type":"ContainerDied","Data":"5248204b1b1d8af8bce0de01e908cc4865a0c7fadfd6537d8fa9e9c89662130e"} Mar 08 05:39:58 crc kubenswrapper[4717]: I0308 05:39:58.414622 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5248204b1b1d8af8bce0de01e908cc4865a0c7fadfd6537d8fa9e9c89662130e" Mar 08 05:39:58 crc kubenswrapper[4717]: I0308 05:39:58.414672 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.151215 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549140-mrfvv"] Mar 08 05:40:00 crc kubenswrapper[4717]: E0308 05:40:00.151744 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" containerName="util" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.151778 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" containerName="util" Mar 08 05:40:00 crc kubenswrapper[4717]: E0308 05:40:00.151822 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" containerName="pull" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.151840 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" containerName="pull" Mar 08 05:40:00 crc kubenswrapper[4717]: E0308 05:40:00.151884 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" containerName="extract" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.151901 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" containerName="extract" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.152119 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab2fa77-0e5e-4c32-86af-eacf41b1902e" containerName="extract" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.153069 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549140-mrfvv" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.157200 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.157604 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.159455 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.167654 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549140-mrfvv"] Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.268938 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzrp5\" (UniqueName: \"kubernetes.io/projected/0ac4553d-f1b4-4587-b172-04c0823d4d67-kube-api-access-qzrp5\") pod \"auto-csr-approver-29549140-mrfvv\" (UID: \"0ac4553d-f1b4-4587-b172-04c0823d4d67\") " pod="openshift-infra/auto-csr-approver-29549140-mrfvv" Mar 08 05:40:00 crc 
kubenswrapper[4717]: I0308 05:40:00.371515 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzrp5\" (UniqueName: \"kubernetes.io/projected/0ac4553d-f1b4-4587-b172-04c0823d4d67-kube-api-access-qzrp5\") pod \"auto-csr-approver-29549140-mrfvv\" (UID: \"0ac4553d-f1b4-4587-b172-04c0823d4d67\") " pod="openshift-infra/auto-csr-approver-29549140-mrfvv" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.403756 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzrp5\" (UniqueName: \"kubernetes.io/projected/0ac4553d-f1b4-4587-b172-04c0823d4d67-kube-api-access-qzrp5\") pod \"auto-csr-approver-29549140-mrfvv\" (UID: \"0ac4553d-f1b4-4587-b172-04c0823d4d67\") " pod="openshift-infra/auto-csr-approver-29549140-mrfvv" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.480222 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549140-mrfvv" Mar 08 05:40:00 crc kubenswrapper[4717]: I0308 05:40:00.787625 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549140-mrfvv"] Mar 08 05:40:01 crc kubenswrapper[4717]: I0308 05:40:01.442848 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549140-mrfvv" event={"ID":"0ac4553d-f1b4-4587-b172-04c0823d4d67","Type":"ContainerStarted","Data":"7f95f1ad3dbd519c32c73c8e3e15be3b770054a4cd2bbd478009442ae04090e2"} Mar 08 05:40:02 crc kubenswrapper[4717]: I0308 05:40:02.317017 4717 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 05:40:03 crc kubenswrapper[4717]: I0308 05:40:03.456706 4717 generic.go:334] "Generic (PLEG): container finished" podID="0ac4553d-f1b4-4587-b172-04c0823d4d67" containerID="bb87281bca193672a9b4398d82ec10abf47e1e32da4d7bfd427efc62a5f7fd3e" exitCode=0 Mar 08 05:40:03 crc kubenswrapper[4717]: I0308 
05:40:03.456844 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549140-mrfvv" event={"ID":"0ac4553d-f1b4-4587-b172-04c0823d4d67","Type":"ContainerDied","Data":"bb87281bca193672a9b4398d82ec10abf47e1e32da4d7bfd427efc62a5f7fd3e"} Mar 08 05:40:04 crc kubenswrapper[4717]: I0308 05:40:04.119587 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:40:04 crc kubenswrapper[4717]: I0308 05:40:04.119670 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:40:04 crc kubenswrapper[4717]: I0308 05:40:04.857608 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549140-mrfvv" Mar 08 05:40:04 crc kubenswrapper[4717]: I0308 05:40:04.949607 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzrp5\" (UniqueName: \"kubernetes.io/projected/0ac4553d-f1b4-4587-b172-04c0823d4d67-kube-api-access-qzrp5\") pod \"0ac4553d-f1b4-4587-b172-04c0823d4d67\" (UID: \"0ac4553d-f1b4-4587-b172-04c0823d4d67\") " Mar 08 05:40:04 crc kubenswrapper[4717]: I0308 05:40:04.963985 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac4553d-f1b4-4587-b172-04c0823d4d67-kube-api-access-qzrp5" (OuterVolumeSpecName: "kube-api-access-qzrp5") pod "0ac4553d-f1b4-4587-b172-04c0823d4d67" (UID: "0ac4553d-f1b4-4587-b172-04c0823d4d67"). InnerVolumeSpecName "kube-api-access-qzrp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:40:05 crc kubenswrapper[4717]: I0308 05:40:05.051703 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzrp5\" (UniqueName: \"kubernetes.io/projected/0ac4553d-f1b4-4587-b172-04c0823d4d67-kube-api-access-qzrp5\") on node \"crc\" DevicePath \"\"" Mar 08 05:40:05 crc kubenswrapper[4717]: I0308 05:40:05.475272 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549140-mrfvv" event={"ID":"0ac4553d-f1b4-4587-b172-04c0823d4d67","Type":"ContainerDied","Data":"7f95f1ad3dbd519c32c73c8e3e15be3b770054a4cd2bbd478009442ae04090e2"} Mar 08 05:40:05 crc kubenswrapper[4717]: I0308 05:40:05.475335 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f95f1ad3dbd519c32c73c8e3e15be3b770054a4cd2bbd478009442ae04090e2" Mar 08 05:40:05 crc kubenswrapper[4717]: I0308 05:40:05.475373 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549140-mrfvv" Mar 08 05:40:05 crc kubenswrapper[4717]: I0308 05:40:05.965146 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549134-cvqtc"] Mar 08 05:40:05 crc kubenswrapper[4717]: I0308 05:40:05.968029 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549134-cvqtc"] Mar 08 05:40:07 crc kubenswrapper[4717]: I0308 05:40:07.788367 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdeb988f-cd4f-474b-b681-51b543f513ef" path="/var/lib/kubelet/pods/bdeb988f-cd4f-474b-b681-51b543f513ef/volumes" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.554292 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk"] Mar 08 05:40:09 crc kubenswrapper[4717]: E0308 05:40:09.554998 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0ac4553d-f1b4-4587-b172-04c0823d4d67" containerName="oc" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.555016 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac4553d-f1b4-4587-b172-04c0823d4d67" containerName="oc" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.555181 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac4553d-f1b4-4587-b172-04c0823d4d67" containerName="oc" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.555793 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.560375 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.560936 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-62vr2" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.562634 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.574825 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk"] Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.601284 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp"] Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.602806 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.606955 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-dt8ts" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.607231 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.622330 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd"] Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.623393 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.631386 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp"] Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.677609 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd"] Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.720183 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dad3a63c-7244-41bd-85d4-38046d2ecf3f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp\" (UID: \"dad3a63c-7244-41bd-85d4-38046d2ecf3f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.720701 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e21fc32-f762-4f29-9ed5-7ab0e28be6a7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd\" (UID: \"0e21fc32-f762-4f29-9ed5-7ab0e28be6a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.720897 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whsfp\" (UniqueName: \"kubernetes.io/projected/186bbba6-72b1-4834-9f78-65c0099a8be8-kube-api-access-whsfp\") pod \"obo-prometheus-operator-68bc856cb9-9mtbk\" (UID: \"186bbba6-72b1-4834-9f78-65c0099a8be8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.721057 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dad3a63c-7244-41bd-85d4-38046d2ecf3f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp\" (UID: \"dad3a63c-7244-41bd-85d4-38046d2ecf3f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.721217 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e21fc32-f762-4f29-9ed5-7ab0e28be6a7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd\" (UID: \"0e21fc32-f762-4f29-9ed5-7ab0e28be6a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.779029 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-pjlrw"] Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.780231 4717 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.796575 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-hmnst" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.797952 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.801621 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-pjlrw"] Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.823043 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whsfp\" (UniqueName: \"kubernetes.io/projected/186bbba6-72b1-4834-9f78-65c0099a8be8-kube-api-access-whsfp\") pod \"obo-prometheus-operator-68bc856cb9-9mtbk\" (UID: \"186bbba6-72b1-4834-9f78-65c0099a8be8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.823497 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dad3a63c-7244-41bd-85d4-38046d2ecf3f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp\" (UID: \"dad3a63c-7244-41bd-85d4-38046d2ecf3f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.823601 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e21fc32-f762-4f29-9ed5-7ab0e28be6a7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd\" (UID: \"0e21fc32-f762-4f29-9ed5-7ab0e28be6a7\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.823748 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dad3a63c-7244-41bd-85d4-38046d2ecf3f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp\" (UID: \"dad3a63c-7244-41bd-85d4-38046d2ecf3f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.823862 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e21fc32-f762-4f29-9ed5-7ab0e28be6a7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd\" (UID: \"0e21fc32-f762-4f29-9ed5-7ab0e28be6a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.830508 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e21fc32-f762-4f29-9ed5-7ab0e28be6a7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd\" (UID: \"0e21fc32-f762-4f29-9ed5-7ab0e28be6a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.831093 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e21fc32-f762-4f29-9ed5-7ab0e28be6a7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd\" (UID: \"0e21fc32-f762-4f29-9ed5-7ab0e28be6a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.840321 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dad3a63c-7244-41bd-85d4-38046d2ecf3f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp\" (UID: \"dad3a63c-7244-41bd-85d4-38046d2ecf3f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.844640 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dad3a63c-7244-41bd-85d4-38046d2ecf3f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp\" (UID: \"dad3a63c-7244-41bd-85d4-38046d2ecf3f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.845294 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whsfp\" (UniqueName: \"kubernetes.io/projected/186bbba6-72b1-4834-9f78-65c0099a8be8-kube-api-access-whsfp\") pod \"obo-prometheus-operator-68bc856cb9-9mtbk\" (UID: \"186bbba6-72b1-4834-9f78-65c0099a8be8\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.875408 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.918083 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.927212 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9bs4\" (UniqueName: \"kubernetes.io/projected/5eaaec5c-b81f-4400-8237-3cb96bac6a73-kube-api-access-l9bs4\") pod \"observability-operator-59bdc8b94-pjlrw\" (UID: \"5eaaec5c-b81f-4400-8237-3cb96bac6a73\") " pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.927356 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5eaaec5c-b81f-4400-8237-3cb96bac6a73-observability-operator-tls\") pod \"observability-operator-59bdc8b94-pjlrw\" (UID: \"5eaaec5c-b81f-4400-8237-3cb96bac6a73\") " pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" Mar 08 05:40:09 crc kubenswrapper[4717]: I0308 05:40:09.983079 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.028523 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9bs4\" (UniqueName: \"kubernetes.io/projected/5eaaec5c-b81f-4400-8237-3cb96bac6a73-kube-api-access-l9bs4\") pod \"observability-operator-59bdc8b94-pjlrw\" (UID: \"5eaaec5c-b81f-4400-8237-3cb96bac6a73\") " pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.028634 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5eaaec5c-b81f-4400-8237-3cb96bac6a73-observability-operator-tls\") pod \"observability-operator-59bdc8b94-pjlrw\" (UID: \"5eaaec5c-b81f-4400-8237-3cb96bac6a73\") " pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.035521 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5eaaec5c-b81f-4400-8237-3cb96bac6a73-observability-operator-tls\") pod \"observability-operator-59bdc8b94-pjlrw\" (UID: \"5eaaec5c-b81f-4400-8237-3cb96bac6a73\") " pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.079961 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9bs4\" (UniqueName: \"kubernetes.io/projected/5eaaec5c-b81f-4400-8237-3cb96bac6a73-kube-api-access-l9bs4\") pod \"observability-operator-59bdc8b94-pjlrw\" (UID: \"5eaaec5c-b81f-4400-8237-3cb96bac6a73\") " pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.081158 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/perses-operator-5bf474d74f-hmpm8"] Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.082364 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.088167 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-wcrn2" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.097872 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.098539 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-hmpm8"] Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.130971 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5-openshift-service-ca\") pod \"perses-operator-5bf474d74f-hmpm8\" (UID: \"6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5\") " pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.131044 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzhl2\" (UniqueName: \"kubernetes.io/projected/6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5-kube-api-access-pzhl2\") pod \"perses-operator-5bf474d74f-hmpm8\" (UID: \"6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5\") " pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.234651 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5-openshift-service-ca\") pod 
\"perses-operator-5bf474d74f-hmpm8\" (UID: \"6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5\") " pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.234758 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzhl2\" (UniqueName: \"kubernetes.io/projected/6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5-kube-api-access-pzhl2\") pod \"perses-operator-5bf474d74f-hmpm8\" (UID: \"6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5\") " pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.235790 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5-openshift-service-ca\") pod \"perses-operator-5bf474d74f-hmpm8\" (UID: \"6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5\") " pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.266583 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzhl2\" (UniqueName: \"kubernetes.io/projected/6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5-kube-api-access-pzhl2\") pod \"perses-operator-5bf474d74f-hmpm8\" (UID: \"6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5\") " pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.404436 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd"] Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.413939 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.479282 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-pjlrw"] Mar 08 05:40:10 crc kubenswrapper[4717]: W0308 05:40:10.499976 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eaaec5c_b81f_4400_8237_3cb96bac6a73.slice/crio-af2d34eb5d39ea78e3752a7c397ec9c30e769eec88465f52cb0c2ad9fadee2d3 WatchSource:0}: Error finding container af2d34eb5d39ea78e3752a7c397ec9c30e769eec88465f52cb0c2ad9fadee2d3: Status 404 returned error can't find the container with id af2d34eb5d39ea78e3752a7c397ec9c30e769eec88465f52cb0c2ad9fadee2d3 Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.509365 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" event={"ID":"0e21fc32-f762-4f29-9ed5-7ab0e28be6a7","Type":"ContainerStarted","Data":"eafbf622ffd5dadf40277d7eb237f3eb9efe521295e2c49df622d13c3b72880b"} Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.525615 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp"] Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.578600 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk"] Mar 08 05:40:10 crc kubenswrapper[4717]: I0308 05:40:10.709238 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-hmpm8"] Mar 08 05:40:10 crc kubenswrapper[4717]: W0308 05:40:10.715019 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dfc30c5_04dd_4c4f_96ad_9ebdbaf84dd5.slice/crio-7c6da316e8a2fee3312acec32d1da0af900f5ddc08dd21d01fc8c865e59353a0 WatchSource:0}: Error finding container 7c6da316e8a2fee3312acec32d1da0af900f5ddc08dd21d01fc8c865e59353a0: Status 404 returned error can't find the container with id 7c6da316e8a2fee3312acec32d1da0af900f5ddc08dd21d01fc8c865e59353a0 Mar 08 05:40:11 crc kubenswrapper[4717]: I0308 05:40:11.529861 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" event={"ID":"5eaaec5c-b81f-4400-8237-3cb96bac6a73","Type":"ContainerStarted","Data":"af2d34eb5d39ea78e3752a7c397ec9c30e769eec88465f52cb0c2ad9fadee2d3"} Mar 08 05:40:11 crc kubenswrapper[4717]: I0308 05:40:11.531795 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk" event={"ID":"186bbba6-72b1-4834-9f78-65c0099a8be8","Type":"ContainerStarted","Data":"e5493f2ec274894031d04e25a45900482fc4a212e5490fd6b96635e60ddb4b75"} Mar 08 05:40:11 crc kubenswrapper[4717]: I0308 05:40:11.533394 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" event={"ID":"6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5","Type":"ContainerStarted","Data":"7c6da316e8a2fee3312acec32d1da0af900f5ddc08dd21d01fc8c865e59353a0"} Mar 08 05:40:11 crc kubenswrapper[4717]: I0308 05:40:11.536266 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" event={"ID":"dad3a63c-7244-41bd-85d4-38046d2ecf3f","Type":"ContainerStarted","Data":"5219dac4afc4a5ea532ee5092137e13cbeb1c2d986408df695e92920b798d1a0"} Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.630539 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" 
event={"ID":"0e21fc32-f762-4f29-9ed5-7ab0e28be6a7","Type":"ContainerStarted","Data":"919fdda4957e72552bc7456f90a5a766eb4d905c5e3023462f823c9be390253a"} Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.632657 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" event={"ID":"6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5","Type":"ContainerStarted","Data":"0c3d111bf37153f24c157ca656c8b84add5383e4cf68fdbdcd3a83cc82f454f3"} Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.632743 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.633954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" event={"ID":"dad3a63c-7244-41bd-85d4-38046d2ecf3f","Type":"ContainerStarted","Data":"b958980507250499274282e273abf30b978ff50a57f0adb2c49d2fee0bf8686f"} Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.635793 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" event={"ID":"5eaaec5c-b81f-4400-8237-3cb96bac6a73","Type":"ContainerStarted","Data":"8137f2bb16334eb36dec3568b02829f0cdead3604b082358c42bde50b072eb7d"} Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.635949 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.637776 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk" event={"ID":"186bbba6-72b1-4834-9f78-65c0099a8be8","Type":"ContainerStarted","Data":"92c325db629dd8c22f9bcab5ed805b94976921a515c24278b48f87e23f009e59"} Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.664527 4717 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd" podStartSLOduration=2.535899496 podStartE2EDuration="14.664500003s" podCreationTimestamp="2026-03-08 05:40:09 +0000 UTC" firstStartedPulling="2026-03-08 05:40:10.432760787 +0000 UTC m=+837.350409631" lastFinishedPulling="2026-03-08 05:40:22.561361294 +0000 UTC m=+849.479010138" observedRunningTime="2026-03-08 05:40:23.655255578 +0000 UTC m=+850.572904422" watchObservedRunningTime="2026-03-08 05:40:23.664500003 +0000 UTC m=+850.582148847" Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.680079 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.681204 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" podStartSLOduration=1.8310634970000002 podStartE2EDuration="13.68117992s" podCreationTimestamp="2026-03-08 05:40:10 +0000 UTC" firstStartedPulling="2026-03-08 05:40:10.719432281 +0000 UTC m=+837.637081125" lastFinishedPulling="2026-03-08 05:40:22.569548704 +0000 UTC m=+849.487197548" observedRunningTime="2026-03-08 05:40:23.676968667 +0000 UTC m=+850.594617511" watchObservedRunningTime="2026-03-08 05:40:23.68117992 +0000 UTC m=+850.598828764" Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.709310 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9mtbk" podStartSLOduration=2.724802495 podStartE2EDuration="14.709282805s" podCreationTimestamp="2026-03-08 05:40:09 +0000 UTC" firstStartedPulling="2026-03-08 05:40:10.581576738 +0000 UTC m=+837.499225582" lastFinishedPulling="2026-03-08 05:40:22.566057008 +0000 UTC m=+849.483705892" observedRunningTime="2026-03-08 05:40:23.702906829 +0000 UTC m=+850.620555683" 
watchObservedRunningTime="2026-03-08 05:40:23.709282805 +0000 UTC m=+850.626931659" Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.737205 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp" podStartSLOduration=2.7824087090000003 podStartE2EDuration="14.737174645s" podCreationTimestamp="2026-03-08 05:40:09 +0000 UTC" firstStartedPulling="2026-03-08 05:40:10.581982448 +0000 UTC m=+837.499631292" lastFinishedPulling="2026-03-08 05:40:22.536748344 +0000 UTC m=+849.454397228" observedRunningTime="2026-03-08 05:40:23.73206072 +0000 UTC m=+850.649709584" watchObservedRunningTime="2026-03-08 05:40:23.737174645 +0000 UTC m=+850.654823499" Mar 08 05:40:23 crc kubenswrapper[4717]: I0308 05:40:23.765162 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-pjlrw" podStartSLOduration=2.70581047 podStartE2EDuration="14.765139136s" podCreationTimestamp="2026-03-08 05:40:09 +0000 UTC" firstStartedPulling="2026-03-08 05:40:10.50951391 +0000 UTC m=+837.427162744" lastFinishedPulling="2026-03-08 05:40:22.568842536 +0000 UTC m=+849.486491410" observedRunningTime="2026-03-08 05:40:23.759635202 +0000 UTC m=+850.677284046" watchObservedRunningTime="2026-03-08 05:40:23.765139136 +0000 UTC m=+850.682787980" Mar 08 05:40:24 crc kubenswrapper[4717]: I0308 05:40:24.090728 4717 scope.go:117] "RemoveContainer" containerID="1eb582cdaa4328a91b72e2af556c6507fc77403ac3eedb455b86e2b240621b83" Mar 08 05:40:30 crc kubenswrapper[4717]: I0308 05:40:30.417646 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-hmpm8" Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.119959 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.120640 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.120716 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.121458 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf6a478def8e3551d842bb82fdd1ad06931612308caf68537b97f44e9f97c812"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.121509 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://cf6a478def8e3551d842bb82fdd1ad06931612308caf68537b97f44e9f97c812" gracePeriod=600 Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.712315 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="cf6a478def8e3551d842bb82fdd1ad06931612308caf68537b97f44e9f97c812" exitCode=0 Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.712385 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"cf6a478def8e3551d842bb82fdd1ad06931612308caf68537b97f44e9f97c812"} Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.713008 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"c4b2434c01f53ad405ba837cb47237c7e26c6fdc63e5e92c263085831d1dc0d5"} Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.713035 4717 scope.go:117] "RemoveContainer" containerID="c9ab5c99eb8a5c63b3392563af981b69aeb09a7e94e4e6bbdd3f13cd801f288c" Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.895752 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xdtt2"] Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.897009 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.913364 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdtt2"] Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.947413 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5bxs\" (UniqueName: \"kubernetes.io/projected/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-kube-api-access-x5bxs\") pod \"redhat-marketplace-xdtt2\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.947511 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-catalog-content\") pod \"redhat-marketplace-xdtt2\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " 
pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:34 crc kubenswrapper[4717]: I0308 05:40:34.947563 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-utilities\") pod \"redhat-marketplace-xdtt2\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.049169 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-utilities\") pod \"redhat-marketplace-xdtt2\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.049282 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5bxs\" (UniqueName: \"kubernetes.io/projected/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-kube-api-access-x5bxs\") pod \"redhat-marketplace-xdtt2\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.049314 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-catalog-content\") pod \"redhat-marketplace-xdtt2\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.050298 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-utilities\") pod \"redhat-marketplace-xdtt2\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " 
pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.050376 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-catalog-content\") pod \"redhat-marketplace-xdtt2\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.084473 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5bxs\" (UniqueName: \"kubernetes.io/projected/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-kube-api-access-x5bxs\") pod \"redhat-marketplace-xdtt2\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.212909 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.504576 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdtt2"] Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.726012 4717 generic.go:334] "Generic (PLEG): container finished" podID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerID="d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75" exitCode=0 Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.726083 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdtt2" event={"ID":"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772","Type":"ContainerDied","Data":"d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75"} Mar 08 05:40:35 crc kubenswrapper[4717]: I0308 05:40:35.726134 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdtt2" 
event={"ID":"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772","Type":"ContainerStarted","Data":"df047c901c5f949040d6623e728f798703813a76d7eb35bbdcd18ea4984d0e1c"} Mar 08 05:40:36 crc kubenswrapper[4717]: I0308 05:40:36.737825 4717 generic.go:334] "Generic (PLEG): container finished" podID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerID="35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834" exitCode=0 Mar 08 05:40:36 crc kubenswrapper[4717]: I0308 05:40:36.737938 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdtt2" event={"ID":"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772","Type":"ContainerDied","Data":"35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834"} Mar 08 05:40:37 crc kubenswrapper[4717]: I0308 05:40:37.750680 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdtt2" event={"ID":"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772","Type":"ContainerStarted","Data":"88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d"} Mar 08 05:40:37 crc kubenswrapper[4717]: I0308 05:40:37.784436 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xdtt2" podStartSLOduration=2.340706237 podStartE2EDuration="3.784405348s" podCreationTimestamp="2026-03-08 05:40:34 +0000 UTC" firstStartedPulling="2026-03-08 05:40:35.728634967 +0000 UTC m=+862.646283811" lastFinishedPulling="2026-03-08 05:40:37.172334038 +0000 UTC m=+864.089982922" observedRunningTime="2026-03-08 05:40:37.780086633 +0000 UTC m=+864.697735487" watchObservedRunningTime="2026-03-08 05:40:37.784405348 +0000 UTC m=+864.702054212" Mar 08 05:40:45 crc kubenswrapper[4717]: I0308 05:40:45.213774 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:45 crc kubenswrapper[4717]: I0308 05:40:45.214617 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:45 crc kubenswrapper[4717]: I0308 05:40:45.288121 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:45 crc kubenswrapper[4717]: I0308 05:40:45.888627 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:45 crc kubenswrapper[4717]: I0308 05:40:45.952626 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdtt2"] Mar 08 05:40:47 crc kubenswrapper[4717]: I0308 05:40:47.827140 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xdtt2" podUID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerName="registry-server" containerID="cri-o://88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d" gracePeriod=2 Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.190595 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd"] Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.192494 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.196275 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.205417 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd"] Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.214149 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdtt2" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.260520 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5bxs\" (UniqueName: \"kubernetes.io/projected/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-kube-api-access-x5bxs\") pod \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.260594 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-utilities\") pod \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.260636 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-catalog-content\") pod \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\" (UID: \"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772\") " Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.260949 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.260994 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") " 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.261418 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7p9c\" (UniqueName: \"kubernetes.io/projected/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-kube-api-access-k7p9c\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.262029 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-utilities" (OuterVolumeSpecName: "utilities") pod "5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" (UID: "5f3e3510-c1e1-4e6f-966d-50a6fdcb9772"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.271156 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-kube-api-access-x5bxs" (OuterVolumeSpecName: "kube-api-access-x5bxs") pod "5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" (UID: "5f3e3510-c1e1-4e6f-966d-50a6fdcb9772"). InnerVolumeSpecName "kube-api-access-x5bxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.293836 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" (UID: "5f3e3510-c1e1-4e6f-966d-50a6fdcb9772"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.362639 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.362747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.362787 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7p9c\" (UniqueName: \"kubernetes.io/projected/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-kube-api-access-k7p9c\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.362899 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5bxs\" (UniqueName: \"kubernetes.io/projected/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-kube-api-access-x5bxs\") on node \"crc\" DevicePath \"\"" Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.362917 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 
05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.362933 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.363915 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.364218 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.388999 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7p9c\" (UniqueName: \"kubernetes.io/projected/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-kube-api-access-k7p9c\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.529315 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.843212 4717 generic.go:334] "Generic (PLEG): container finished" podID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerID="88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d" exitCode=0
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.843287 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdtt2" event={"ID":"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772","Type":"ContainerDied","Data":"88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d"}
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.843342 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdtt2" event={"ID":"5f3e3510-c1e1-4e6f-966d-50a6fdcb9772","Type":"ContainerDied","Data":"df047c901c5f949040d6623e728f798703813a76d7eb35bbdcd18ea4984d0e1c"}
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.843365 4717 scope.go:117] "RemoveContainer" containerID="88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.843578 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdtt2"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.852168 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd"]
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.886177 4717 scope.go:117] "RemoveContainer" containerID="35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.907796 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdtt2"]
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.907901 4717 scope.go:117] "RemoveContainer" containerID="d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.913163 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdtt2"]
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.927156 4717 scope.go:117] "RemoveContainer" containerID="88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d"
Mar 08 05:40:48 crc kubenswrapper[4717]: E0308 05:40:48.928007 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d\": container with ID starting with 88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d not found: ID does not exist" containerID="88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.928051 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d"} err="failed to get container status \"88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d\": rpc error: code = NotFound desc = could not find container \"88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d\": container with ID starting with 88985551ee3d88c7ce3f7d4eb39d620ae3f3bf99b93916062ade64bc6d39c27d not found: ID does not exist"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.928078 4717 scope.go:117] "RemoveContainer" containerID="35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834"
Mar 08 05:40:48 crc kubenswrapper[4717]: E0308 05:40:48.928470 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834\": container with ID starting with 35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834 not found: ID does not exist" containerID="35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.928504 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834"} err="failed to get container status \"35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834\": rpc error: code = NotFound desc = could not find container \"35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834\": container with ID starting with 35c50c0fae97edfeae2dc267ae1741d33f9420687ad3a56b77b894a057c86834 not found: ID does not exist"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.928519 4717 scope.go:117] "RemoveContainer" containerID="d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75"
Mar 08 05:40:48 crc kubenswrapper[4717]: E0308 05:40:48.928796 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75\": container with ID starting with d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75 not found: ID does not exist" containerID="d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75"
Mar 08 05:40:48 crc kubenswrapper[4717]: I0308 05:40:48.928811 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75"} err="failed to get container status \"d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75\": rpc error: code = NotFound desc = could not find container \"d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75\": container with ID starting with d3691ed9d51affb2d897408bf653394f0fab86970d6d89b3960fd7afc4136d75 not found: ID does not exist"
Mar 08 05:40:49 crc kubenswrapper[4717]: I0308 05:40:49.798256 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" path="/var/lib/kubelet/pods/5f3e3510-c1e1-4e6f-966d-50a6fdcb9772/volumes"
Mar 08 05:40:49 crc kubenswrapper[4717]: I0308 05:40:49.858188 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" containerID="7580bcc25e33ffc43706903e52f2e1b3ad9819e43d6b0c032820b24eeb0dc64a" exitCode=0
Mar 08 05:40:49 crc kubenswrapper[4717]: I0308 05:40:49.858269 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" event={"ID":"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58","Type":"ContainerDied","Data":"7580bcc25e33ffc43706903e52f2e1b3ad9819e43d6b0c032820b24eeb0dc64a"}
Mar 08 05:40:49 crc kubenswrapper[4717]: I0308 05:40:49.858322 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" event={"ID":"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58","Type":"ContainerStarted","Data":"eac145454c058b22a8072991a68ce49e6f083037d2a2d543da2bf1ce957d4664"}
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.551495 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-clfds"]
Mar 08 05:40:51 crc kubenswrapper[4717]: E0308 05:40:51.551858 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerName="extract-content"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.551873 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerName="extract-content"
Mar 08 05:40:51 crc kubenswrapper[4717]: E0308 05:40:51.551884 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerName="extract-utilities"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.551890 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerName="extract-utilities"
Mar 08 05:40:51 crc kubenswrapper[4717]: E0308 05:40:51.551897 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerName="registry-server"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.551904 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerName="registry-server"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.552003 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f3e3510-c1e1-4e6f-966d-50a6fdcb9772" containerName="registry-server"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.552950 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.568744 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-clfds"]
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.618228 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-utilities\") pod \"redhat-operators-clfds\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.618695 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88dzq\" (UniqueName: \"kubernetes.io/projected/b113ae4b-122d-45a8-8731-99ce6125449e-kube-api-access-88dzq\") pod \"redhat-operators-clfds\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.618729 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-catalog-content\") pod \"redhat-operators-clfds\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.719583 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-utilities\") pod \"redhat-operators-clfds\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.719645 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88dzq\" (UniqueName: \"kubernetes.io/projected/b113ae4b-122d-45a8-8731-99ce6125449e-kube-api-access-88dzq\") pod \"redhat-operators-clfds\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.719670 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-catalog-content\") pod \"redhat-operators-clfds\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.720139 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-utilities\") pod \"redhat-operators-clfds\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.720179 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-catalog-content\") pod \"redhat-operators-clfds\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.753259 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88dzq\" (UniqueName: \"kubernetes.io/projected/b113ae4b-122d-45a8-8731-99ce6125449e-kube-api-access-88dzq\") pod \"redhat-operators-clfds\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.872892 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.875700 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" containerID="c563cf538106e182e85ef1b25ebb021aa85d123de4e0230e9bd7bf35e267c191" exitCode=0
Mar 08 05:40:51 crc kubenswrapper[4717]: I0308 05:40:51.875770 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" event={"ID":"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58","Type":"ContainerDied","Data":"c563cf538106e182e85ef1b25ebb021aa85d123de4e0230e9bd7bf35e267c191"}
Mar 08 05:40:52 crc kubenswrapper[4717]: I0308 05:40:52.124237 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-clfds"]
Mar 08 05:40:52 crc kubenswrapper[4717]: I0308 05:40:52.888166 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clfds" event={"ID":"b113ae4b-122d-45a8-8731-99ce6125449e","Type":"ContainerStarted","Data":"4903ee637595c28d218d1a193cdeebc9924f8728cdd5c0b91ffcae7630796867"}
Mar 08 05:40:53 crc kubenswrapper[4717]: I0308 05:40:53.900316 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" event={"ID":"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58","Type":"ContainerStarted","Data":"428b74d08dbb3289cf375c51776d9018377e0d529fe227e42d75dad39c00257d"}
Mar 08 05:40:54 crc kubenswrapper[4717]: I0308 05:40:54.914483 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" containerID="428b74d08dbb3289cf375c51776d9018377e0d529fe227e42d75dad39c00257d" exitCode=0
Mar 08 05:40:54 crc kubenswrapper[4717]: I0308 05:40:54.914665 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" event={"ID":"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58","Type":"ContainerDied","Data":"428b74d08dbb3289cf375c51776d9018377e0d529fe227e42d75dad39c00257d"}
Mar 08 05:40:54 crc kubenswrapper[4717]: I0308 05:40:54.917109 4717 generic.go:334] "Generic (PLEG): container finished" podID="b113ae4b-122d-45a8-8731-99ce6125449e" containerID="a1e396bcda2720f8e74057cb26e58707fb1e8b620435946fab4a79ed23401a39" exitCode=0
Mar 08 05:40:54 crc kubenswrapper[4717]: I0308 05:40:54.917164 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clfds" event={"ID":"b113ae4b-122d-45a8-8731-99ce6125449e","Type":"ContainerDied","Data":"a1e396bcda2720f8e74057cb26e58707fb1e8b620435946fab4a79ed23401a39"}
Mar 08 05:40:55 crc kubenswrapper[4717]: I0308 05:40:55.927951 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clfds" event={"ID":"b113ae4b-122d-45a8-8731-99ce6125449e","Type":"ContainerStarted","Data":"c01975eded60ab163703827f6df9213aa4e0632449005c33c62f262393d56a71"}
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.260979 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd"
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.396079 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-util\") pod \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") "
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.396161 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7p9c\" (UniqueName: \"kubernetes.io/projected/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-kube-api-access-k7p9c\") pod \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") "
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.396273 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-bundle\") pod \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\" (UID: \"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58\") "
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.397166 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-bundle" (OuterVolumeSpecName: "bundle") pod "ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" (UID: "ea43c0a0-25a8-4da0-b2a3-f94c285c9e58"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.414020 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-kube-api-access-k7p9c" (OuterVolumeSpecName: "kube-api-access-k7p9c") pod "ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" (UID: "ea43c0a0-25a8-4da0-b2a3-f94c285c9e58"). InnerVolumeSpecName "kube-api-access-k7p9c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.424268 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-util" (OuterVolumeSpecName: "util") pod "ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" (UID: "ea43c0a0-25a8-4da0-b2a3-f94c285c9e58"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.497983 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-util\") on node \"crc\" DevicePath \"\""
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.498038 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7p9c\" (UniqueName: \"kubernetes.io/projected/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-kube-api-access-k7p9c\") on node \"crc\" DevicePath \"\""
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.498054 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea43c0a0-25a8-4da0-b2a3-f94c285c9e58-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.940675 4717 generic.go:334] "Generic (PLEG): container finished" podID="b113ae4b-122d-45a8-8731-99ce6125449e" containerID="c01975eded60ab163703827f6df9213aa4e0632449005c33c62f262393d56a71" exitCode=0
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.940862 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clfds" event={"ID":"b113ae4b-122d-45a8-8731-99ce6125449e","Type":"ContainerDied","Data":"c01975eded60ab163703827f6df9213aa4e0632449005c33c62f262393d56a71"}
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.947148 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd" event={"ID":"ea43c0a0-25a8-4da0-b2a3-f94c285c9e58","Type":"ContainerDied","Data":"eac145454c058b22a8072991a68ce49e6f083037d2a2d543da2bf1ce957d4664"}
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.947222 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eac145454c058b22a8072991a68ce49e6f083037d2a2d543da2bf1ce957d4664"
Mar 08 05:40:56 crc kubenswrapper[4717]: I0308 05:40:56.947334 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd"
Mar 08 05:40:57 crc kubenswrapper[4717]: I0308 05:40:57.958748 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clfds" event={"ID":"b113ae4b-122d-45a8-8731-99ce6125449e","Type":"ContainerStarted","Data":"f443226261a3b5be41609b961cfc9c055c29bc9266191a7f545a52b4a0fc773b"}
Mar 08 05:40:57 crc kubenswrapper[4717]: I0308 05:40:57.985375 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-clfds" podStartSLOduration=4.555623467 podStartE2EDuration="6.985342904s" podCreationTimestamp="2026-03-08 05:40:51 +0000 UTC" firstStartedPulling="2026-03-08 05:40:54.920035934 +0000 UTC m=+881.837684818" lastFinishedPulling="2026-03-08 05:40:57.349755411 +0000 UTC m=+884.267404255" observedRunningTime="2026-03-08 05:40:57.982932666 +0000 UTC m=+884.900581550" watchObservedRunningTime="2026-03-08 05:40:57.985342904 +0000 UTC m=+884.902991748"
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.822293 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p"]
Mar 08 05:40:58 crc kubenswrapper[4717]: E0308 05:40:58.822568 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" containerName="pull"
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.822586 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" containerName="pull"
Mar 08 05:40:58 crc kubenswrapper[4717]: E0308 05:40:58.822613 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" containerName="util"
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.822621 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" containerName="util"
Mar 08 05:40:58 crc kubenswrapper[4717]: E0308 05:40:58.822636 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" containerName="extract"
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.822643 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" containerName="extract"
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.822765 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea43c0a0-25a8-4da0-b2a3-f94c285c9e58" containerName="extract"
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.823213 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p"
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.826431 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.826449 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.826536 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-cbxw9"
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.836302 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p"]
Mar 08 05:40:58 crc kubenswrapper[4717]: I0308 05:40:58.953275 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmt8l\" (UniqueName: \"kubernetes.io/projected/396ff7f1-399f-4510-96a1-d17996841dba-kube-api-access-pmt8l\") pod \"nmstate-operator-75c5dccd6c-qt88p\" (UID: \"396ff7f1-399f-4510-96a1-d17996841dba\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p"
Mar 08 05:40:59 crc kubenswrapper[4717]: I0308 05:40:59.055524 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmt8l\" (UniqueName: \"kubernetes.io/projected/396ff7f1-399f-4510-96a1-d17996841dba-kube-api-access-pmt8l\") pod \"nmstate-operator-75c5dccd6c-qt88p\" (UID: \"396ff7f1-399f-4510-96a1-d17996841dba\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p"
Mar 08 05:40:59 crc kubenswrapper[4717]: I0308 05:40:59.087429 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmt8l\" (UniqueName: \"kubernetes.io/projected/396ff7f1-399f-4510-96a1-d17996841dba-kube-api-access-pmt8l\") pod \"nmstate-operator-75c5dccd6c-qt88p\" (UID: \"396ff7f1-399f-4510-96a1-d17996841dba\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p"
Mar 08 05:40:59 crc kubenswrapper[4717]: I0308 05:40:59.138028 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p"
Mar 08 05:40:59 crc kubenswrapper[4717]: I0308 05:40:59.639368 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p"]
Mar 08 05:40:59 crc kubenswrapper[4717]: W0308 05:40:59.650781 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod396ff7f1_399f_4510_96a1_d17996841dba.slice/crio-f282f214e736fd62f8d596deb3a5b0539a38cbd69f0e1e2fad49d3b32459ab8f WatchSource:0}: Error finding container f282f214e736fd62f8d596deb3a5b0539a38cbd69f0e1e2fad49d3b32459ab8f: Status 404 returned error can't find the container with id f282f214e736fd62f8d596deb3a5b0539a38cbd69f0e1e2fad49d3b32459ab8f
Mar 08 05:40:59 crc kubenswrapper[4717]: I0308 05:40:59.973647 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p" event={"ID":"396ff7f1-399f-4510-96a1-d17996841dba","Type":"ContainerStarted","Data":"f282f214e736fd62f8d596deb3a5b0539a38cbd69f0e1e2fad49d3b32459ab8f"}
Mar 08 05:41:01 crc kubenswrapper[4717]: I0308 05:41:01.874439 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:41:01 crc kubenswrapper[4717]: I0308 05:41:01.877005 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-clfds"
Mar 08 05:41:02 crc kubenswrapper[4717]: I0308 05:41:02.953511 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-clfds" podUID="b113ae4b-122d-45a8-8731-99ce6125449e" containerName="registry-server" probeResult="failure" output=<
Mar 08 05:41:02 crc kubenswrapper[4717]: 	timeout: failed to connect service ":50051" within 1s
Mar 08 05:41:02 crc kubenswrapper[4717]: >
Mar 08 05:41:03 crc kubenswrapper[4717]: I0308 05:41:03.002511 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p" event={"ID":"396ff7f1-399f-4510-96a1-d17996841dba","Type":"ContainerStarted","Data":"27ab079a4df379ca217430a1bf6f08e3e80a983368180401590e46a6fd66580d"}
Mar 08 05:41:03 crc kubenswrapper[4717]: I0308 05:41:03.025468 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qt88p" podStartSLOduration=1.9412599 podStartE2EDuration="5.02543631s" podCreationTimestamp="2026-03-08 05:40:58 +0000 UTC" firstStartedPulling="2026-03-08 05:40:59.65447908 +0000 UTC m=+886.572127924" lastFinishedPulling="2026-03-08 05:41:02.73865548 +0000 UTC m=+889.656304334" observedRunningTime="2026-03-08 05:41:03.021547316 +0000 UTC m=+889.939196200" watchObservedRunningTime="2026-03-08 05:41:03.02543631 +0000 UTC m=+889.943085174"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.366897 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-nvc4p"]
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.369454 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-nvc4p"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.375226 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rcd2b"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.385291 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m"]
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.386390 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.389057 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.394727 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-57gbg"]
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.396023 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-57gbg"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.404213 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-nvc4p"]
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.419744 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m"]
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.534916 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm"]
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.536055 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.537652 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a99ee055-c0a9-4a9b-8787-45f90f0e41f0-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-gcpqm\" (UID: \"a99ee055-c0a9-4a9b-8787-45f90f0e41f0\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.537734 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b82r\" (UniqueName: \"kubernetes.io/projected/a99ee055-c0a9-4a9b-8787-45f90f0e41f0-kube-api-access-2b82r\") pod \"nmstate-console-plugin-5dcbbd79cf-gcpqm\" (UID: \"a99ee055-c0a9-4a9b-8787-45f90f0e41f0\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.537808 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3e38f069-dcb2-471a-9124-87af836a0e11-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-w6x5m\" (UID: \"3e38f069-dcb2-471a-9124-87af836a0e11\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.537889 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/18fa48fe-7964-43d4-8e35-f0e459dd40ea-nmstate-lock\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.537944 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z9cz\" (UniqueName: \"kubernetes.io/projected/474b5a28-e5de-4fdc-814e-588f604686f4-kube-api-access-5z9cz\") pod \"nmstate-metrics-69594cc75-nvc4p\" (UID: \"474b5a28-e5de-4fdc-814e-588f604686f4\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-nvc4p"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.538002 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/18fa48fe-7964-43d4-8e35-f0e459dd40ea-dbus-socket\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.538021 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a99ee055-c0a9-4a9b-8787-45f90f0e41f0-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-gcpqm\" (UID: \"a99ee055-c0a9-4a9b-8787-45f90f0e41f0\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.538065 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjhdx\" (UniqueName: \"kubernetes.io/projected/18fa48fe-7964-43d4-8e35-f0e459dd40ea-kube-api-access-fjhdx\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.538095 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/18fa48fe-7964-43d4-8e35-f0e459dd40ea-ovs-socket\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.538118 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6qw6\" (UniqueName: \"kubernetes.io/projected/3e38f069-dcb2-471a-9124-87af836a0e11-kube-api-access-g6qw6\") pod \"nmstate-webhook-786f45cff4-w6x5m\" (UID: \"3e38f069-dcb2-471a-9124-87af836a0e11\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.538912 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.539242 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.539404 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rd5tm"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.551021 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm"]
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.639740 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a99ee055-c0a9-4a9b-8787-45f90f0e41f0-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-gcpqm\" (UID: \"a99ee055-c0a9-4a9b-8787-45f90f0e41f0\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.639823 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b82r\" (UniqueName: \"kubernetes.io/projected/a99ee055-c0a9-4a9b-8787-45f90f0e41f0-kube-api-access-2b82r\") pod \"nmstate-console-plugin-5dcbbd79cf-gcpqm\" (UID: \"a99ee055-c0a9-4a9b-8787-45f90f0e41f0\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.639896 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3e38f069-dcb2-471a-9124-87af836a0e11-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-w6x5m\" (UID: \"3e38f069-dcb2-471a-9124-87af836a0e11\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.639957 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/18fa48fe-7964-43d4-8e35-f0e459dd40ea-nmstate-lock\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.639988 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z9cz\" (UniqueName: \"kubernetes.io/projected/474b5a28-e5de-4fdc-814e-588f604686f4-kube-api-access-5z9cz\") pod \"nmstate-metrics-69594cc75-nvc4p\" (UID: \"474b5a28-e5de-4fdc-814e-588f604686f4\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-nvc4p"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.640041 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/18fa48fe-7964-43d4-8e35-f0e459dd40ea-dbus-socket\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.640060 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a99ee055-c0a9-4a9b-8787-45f90f0e41f0-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-gcpqm\" (UID: \"a99ee055-c0a9-4a9b-8787-45f90f0e41f0\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm"
Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.640105 4717 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-fjhdx\" (UniqueName: \"kubernetes.io/projected/18fa48fe-7964-43d4-8e35-f0e459dd40ea-kube-api-access-fjhdx\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.640131 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/18fa48fe-7964-43d4-8e35-f0e459dd40ea-ovs-socket\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.640150 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6qw6\" (UniqueName: \"kubernetes.io/projected/3e38f069-dcb2-471a-9124-87af836a0e11-kube-api-access-g6qw6\") pod \"nmstate-webhook-786f45cff4-w6x5m\" (UID: \"3e38f069-dcb2-471a-9124-87af836a0e11\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.640356 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/18fa48fe-7964-43d4-8e35-f0e459dd40ea-nmstate-lock\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.641549 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/18fa48fe-7964-43d4-8e35-f0e459dd40ea-dbus-socket\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.641944 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" 
(UniqueName: \"kubernetes.io/host-path/18fa48fe-7964-43d4-8e35-f0e459dd40ea-ovs-socket\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.642086 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a99ee055-c0a9-4a9b-8787-45f90f0e41f0-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-gcpqm\" (UID: \"a99ee055-c0a9-4a9b-8787-45f90f0e41f0\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.663611 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3e38f069-dcb2-471a-9124-87af836a0e11-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-w6x5m\" (UID: \"3e38f069-dcb2-471a-9124-87af836a0e11\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.663715 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a99ee055-c0a9-4a9b-8787-45f90f0e41f0-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-gcpqm\" (UID: \"a99ee055-c0a9-4a9b-8787-45f90f0e41f0\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.669375 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjhdx\" (UniqueName: \"kubernetes.io/projected/18fa48fe-7964-43d4-8e35-f0e459dd40ea-kube-api-access-fjhdx\") pod \"nmstate-handler-57gbg\" (UID: \"18fa48fe-7964-43d4-8e35-f0e459dd40ea\") " pod="openshift-nmstate/nmstate-handler-57gbg" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.669456 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z9cz\" (UniqueName: 
\"kubernetes.io/projected/474b5a28-e5de-4fdc-814e-588f604686f4-kube-api-access-5z9cz\") pod \"nmstate-metrics-69594cc75-nvc4p\" (UID: \"474b5a28-e5de-4fdc-814e-588f604686f4\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-nvc4p" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.688925 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b82r\" (UniqueName: \"kubernetes.io/projected/a99ee055-c0a9-4a9b-8787-45f90f0e41f0-kube-api-access-2b82r\") pod \"nmstate-console-plugin-5dcbbd79cf-gcpqm\" (UID: \"a99ee055-c0a9-4a9b-8787-45f90f0e41f0\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.691172 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-nvc4p" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.702639 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6qw6\" (UniqueName: \"kubernetes.io/projected/3e38f069-dcb2-471a-9124-87af836a0e11-kube-api-access-g6qw6\") pod \"nmstate-webhook-786f45cff4-w6x5m\" (UID: \"3e38f069-dcb2-471a-9124-87af836a0e11\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.708015 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.732374 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-57gbg" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.829315 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.857411 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.867015 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d88669946-9ln4g"] Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.867953 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:09 crc kubenswrapper[4717]: I0308 05:41:09.881908 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d88669946-9ln4g"] Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.058871 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-57gbg" event={"ID":"18fa48fe-7964-43d4-8e35-f0e459dd40ea","Type":"ContainerStarted","Data":"39da6642fc5e59cf75aacdc302869d5c49b27053512bf4740fb17b48163e0a5e"} Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.060336 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-trusted-ca-bundle\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.060396 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-oauth-serving-cert\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.060672 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8j4c\" (UniqueName: 
\"kubernetes.io/projected/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-kube-api-access-m8j4c\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.060806 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-console-serving-cert\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.060902 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-service-ca\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.061018 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-console-oauth-config\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.061067 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-console-config\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.104199 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-nmstate/nmstate-metrics-69594cc75-nvc4p"] Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.129483 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm"] Mar 08 05:41:10 crc kubenswrapper[4717]: W0308 05:41:10.140058 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda99ee055_c0a9_4a9b_8787_45f90f0e41f0.slice/crio-9e01c666b26887e7b4f8239f415499f1638c35fc2093d78e1d0e963edaab9bb6 WatchSource:0}: Error finding container 9e01c666b26887e7b4f8239f415499f1638c35fc2093d78e1d0e963edaab9bb6: Status 404 returned error can't find the container with id 9e01c666b26887e7b4f8239f415499f1638c35fc2093d78e1d0e963edaab9bb6 Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.162651 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-console-oauth-config\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.162724 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-console-config\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.162779 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-trusted-ca-bundle\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc 
kubenswrapper[4717]: I0308 05:41:10.162813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-oauth-serving-cert\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.162833 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8j4c\" (UniqueName: \"kubernetes.io/projected/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-kube-api-access-m8j4c\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.162858 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-console-serving-cert\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.162881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-service-ca\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.163961 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-console-config\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 
05:41:10.164573 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-service-ca\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.165529 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-trusted-ca-bundle\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.171020 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-oauth-serving-cert\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.171708 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-console-serving-cert\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.171769 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-console-oauth-config\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.184021 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m8j4c\" (UniqueName: \"kubernetes.io/projected/cef8e0ef-630e-4434-af64-cff5fc1e9d4a-kube-api-access-m8j4c\") pod \"console-6d88669946-9ln4g\" (UID: \"cef8e0ef-630e-4434-af64-cff5fc1e9d4a\") " pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.250923 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.408074 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m"] Mar 08 05:41:10 crc kubenswrapper[4717]: I0308 05:41:10.506925 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d88669946-9ln4g"] Mar 08 05:41:10 crc kubenswrapper[4717]: W0308 05:41:10.517782 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcef8e0ef_630e_4434_af64_cff5fc1e9d4a.slice/crio-0c74b6b04acfc49c95f8216b3507f4b2828d34b4cab1174721a1dcbd5cb92665 WatchSource:0}: Error finding container 0c74b6b04acfc49c95f8216b3507f4b2828d34b4cab1174721a1dcbd5cb92665: Status 404 returned error can't find the container with id 0c74b6b04acfc49c95f8216b3507f4b2828d34b4cab1174721a1dcbd5cb92665 Mar 08 05:41:11 crc kubenswrapper[4717]: I0308 05:41:11.069564 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-nvc4p" event={"ID":"474b5a28-e5de-4fdc-814e-588f604686f4","Type":"ContainerStarted","Data":"4907ba9ce04eead984934ef1eee1dd90f9213ba3bb55f62e6eb9397ff52d2f43"} Mar 08 05:41:11 crc kubenswrapper[4717]: I0308 05:41:11.072366 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d88669946-9ln4g" event={"ID":"cef8e0ef-630e-4434-af64-cff5fc1e9d4a","Type":"ContainerStarted","Data":"d6b6ec1b0ed99d1f5aa4e7b04a855d12b9d61feb6aed84a5322e97acd0b0f226"} Mar 
08 05:41:11 crc kubenswrapper[4717]: I0308 05:41:11.072431 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d88669946-9ln4g" event={"ID":"cef8e0ef-630e-4434-af64-cff5fc1e9d4a","Type":"ContainerStarted","Data":"0c74b6b04acfc49c95f8216b3507f4b2828d34b4cab1174721a1dcbd5cb92665"} Mar 08 05:41:11 crc kubenswrapper[4717]: I0308 05:41:11.074449 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm" event={"ID":"a99ee055-c0a9-4a9b-8787-45f90f0e41f0","Type":"ContainerStarted","Data":"9e01c666b26887e7b4f8239f415499f1638c35fc2093d78e1d0e963edaab9bb6"} Mar 08 05:41:11 crc kubenswrapper[4717]: I0308 05:41:11.076050 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m" event={"ID":"3e38f069-dcb2-471a-9124-87af836a0e11","Type":"ContainerStarted","Data":"9e447436b26dc5b5d3bbf1db4dc675345829248b7242d8ca114e323547e1d6ca"} Mar 08 05:41:11 crc kubenswrapper[4717]: I0308 05:41:11.097193 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d88669946-9ln4g" podStartSLOduration=2.097167695 podStartE2EDuration="2.097167695s" podCreationTimestamp="2026-03-08 05:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:41:11.09204359 +0000 UTC m=+898.009692434" watchObservedRunningTime="2026-03-08 05:41:11.097167695 +0000 UTC m=+898.014816569" Mar 08 05:41:11 crc kubenswrapper[4717]: I0308 05:41:11.952901 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-clfds" Mar 08 05:41:12 crc kubenswrapper[4717]: I0308 05:41:12.006175 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-clfds" Mar 08 05:41:12 crc kubenswrapper[4717]: I0308 05:41:12.193982 4717 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-clfds"] Mar 08 05:41:13 crc kubenswrapper[4717]: I0308 05:41:13.089664 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-clfds" podUID="b113ae4b-122d-45a8-8731-99ce6125449e" containerName="registry-server" containerID="cri-o://f443226261a3b5be41609b961cfc9c055c29bc9266191a7f545a52b4a0fc773b" gracePeriod=2 Mar 08 05:41:14 crc kubenswrapper[4717]: I0308 05:41:14.099533 4717 generic.go:334] "Generic (PLEG): container finished" podID="b113ae4b-122d-45a8-8731-99ce6125449e" containerID="f443226261a3b5be41609b961cfc9c055c29bc9266191a7f545a52b4a0fc773b" exitCode=0 Mar 08 05:41:14 crc kubenswrapper[4717]: I0308 05:41:14.100032 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clfds" event={"ID":"b113ae4b-122d-45a8-8731-99ce6125449e","Type":"ContainerDied","Data":"f443226261a3b5be41609b961cfc9c055c29bc9266191a7f545a52b4a0fc773b"} Mar 08 05:41:15 crc kubenswrapper[4717]: I0308 05:41:15.326246 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-clfds" Mar 08 05:41:15 crc kubenswrapper[4717]: I0308 05:41:15.460299 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-catalog-content\") pod \"b113ae4b-122d-45a8-8731-99ce6125449e\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " Mar 08 05:41:15 crc kubenswrapper[4717]: I0308 05:41:15.460436 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88dzq\" (UniqueName: \"kubernetes.io/projected/b113ae4b-122d-45a8-8731-99ce6125449e-kube-api-access-88dzq\") pod \"b113ae4b-122d-45a8-8731-99ce6125449e\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " Mar 08 05:41:15 crc kubenswrapper[4717]: I0308 05:41:15.460626 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-utilities\") pod \"b113ae4b-122d-45a8-8731-99ce6125449e\" (UID: \"b113ae4b-122d-45a8-8731-99ce6125449e\") " Mar 08 05:41:15 crc kubenswrapper[4717]: I0308 05:41:15.462795 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-utilities" (OuterVolumeSpecName: "utilities") pod "b113ae4b-122d-45a8-8731-99ce6125449e" (UID: "b113ae4b-122d-45a8-8731-99ce6125449e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:41:15 crc kubenswrapper[4717]: I0308 05:41:15.472532 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b113ae4b-122d-45a8-8731-99ce6125449e-kube-api-access-88dzq" (OuterVolumeSpecName: "kube-api-access-88dzq") pod "b113ae4b-122d-45a8-8731-99ce6125449e" (UID: "b113ae4b-122d-45a8-8731-99ce6125449e"). InnerVolumeSpecName "kube-api-access-88dzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:41:15 crc kubenswrapper[4717]: I0308 05:41:15.562989 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:15 crc kubenswrapper[4717]: I0308 05:41:15.563036 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88dzq\" (UniqueName: \"kubernetes.io/projected/b113ae4b-122d-45a8-8731-99ce6125449e-kube-api-access-88dzq\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:15 crc kubenswrapper[4717]: I0308 05:41:15.623544 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b113ae4b-122d-45a8-8731-99ce6125449e" (UID: "b113ae4b-122d-45a8-8731-99ce6125449e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:41:15 crc kubenswrapper[4717]: I0308 05:41:15.664611 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b113ae4b-122d-45a8-8731-99ce6125449e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:16 crc kubenswrapper[4717]: I0308 05:41:16.122110 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-clfds" event={"ID":"b113ae4b-122d-45a8-8731-99ce6125449e","Type":"ContainerDied","Data":"4903ee637595c28d218d1a193cdeebc9924f8728cdd5c0b91ffcae7630796867"} Mar 08 05:41:16 crc kubenswrapper[4717]: I0308 05:41:16.122212 4717 scope.go:117] "RemoveContainer" containerID="f443226261a3b5be41609b961cfc9c055c29bc9266191a7f545a52b4a0fc773b" Mar 08 05:41:16 crc kubenswrapper[4717]: I0308 05:41:16.122237 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-clfds" Mar 08 05:41:16 crc kubenswrapper[4717]: I0308 05:41:16.159620 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-clfds"] Mar 08 05:41:16 crc kubenswrapper[4717]: I0308 05:41:16.171821 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-clfds"] Mar 08 05:41:17 crc kubenswrapper[4717]: I0308 05:41:17.220577 4717 scope.go:117] "RemoveContainer" containerID="c01975eded60ab163703827f6df9213aa4e0632449005c33c62f262393d56a71" Mar 08 05:41:17 crc kubenswrapper[4717]: I0308 05:41:17.503786 4717 scope.go:117] "RemoveContainer" containerID="a1e396bcda2720f8e74057cb26e58707fb1e8b620435946fab4a79ed23401a39" Mar 08 05:41:17 crc kubenswrapper[4717]: I0308 05:41:17.794976 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b113ae4b-122d-45a8-8731-99ce6125449e" path="/var/lib/kubelet/pods/b113ae4b-122d-45a8-8731-99ce6125449e/volumes" Mar 08 05:41:18 crc kubenswrapper[4717]: I0308 05:41:18.152950 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm" event={"ID":"a99ee055-c0a9-4a9b-8787-45f90f0e41f0","Type":"ContainerStarted","Data":"29e5b07bff3f661ec9812e47e9e393801f0974947acf11d13eb59e9294202616"} Mar 08 05:41:18 crc kubenswrapper[4717]: I0308 05:41:18.158654 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m" event={"ID":"3e38f069-dcb2-471a-9124-87af836a0e11","Type":"ContainerStarted","Data":"9efa899048ead3fb54593a1cf4c59a21c32f25e28460eb1d9e1d2e3c93b6ebb9"} Mar 08 05:41:18 crc kubenswrapper[4717]: I0308 05:41:18.159993 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m" Mar 08 05:41:18 crc kubenswrapper[4717]: I0308 05:41:18.169383 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-57gbg" event={"ID":"18fa48fe-7964-43d4-8e35-f0e459dd40ea","Type":"ContainerStarted","Data":"7422776d576bf03562e9d559be3baeda196383ddd684c104cb362bab90740df6"} Mar 08 05:41:18 crc kubenswrapper[4717]: I0308 05:41:18.170120 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-57gbg" Mar 08 05:41:18 crc kubenswrapper[4717]: I0308 05:41:18.172589 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-nvc4p" event={"ID":"474b5a28-e5de-4fdc-814e-588f604686f4","Type":"ContainerStarted","Data":"040f5067c0984dc240f2e8d4340f5a58e1585833741af90400321ee94829a173"} Mar 08 05:41:18 crc kubenswrapper[4717]: I0308 05:41:18.188902 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gcpqm" podStartSLOduration=1.825774859 podStartE2EDuration="9.188865601s" podCreationTimestamp="2026-03-08 05:41:09 +0000 UTC" firstStartedPulling="2026-03-08 05:41:10.1435884 +0000 UTC m=+897.061237244" lastFinishedPulling="2026-03-08 05:41:17.506679102 +0000 UTC m=+904.424327986" observedRunningTime="2026-03-08 05:41:18.184790772 +0000 UTC m=+905.102439626" watchObservedRunningTime="2026-03-08 05:41:18.188865601 +0000 UTC m=+905.106514465" Mar 08 05:41:18 crc kubenswrapper[4717]: I0308 05:41:18.210236 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m" podStartSLOduration=2.12322806 podStartE2EDuration="9.210210731s" podCreationTimestamp="2026-03-08 05:41:09 +0000 UTC" firstStartedPulling="2026-03-08 05:41:10.420030959 +0000 UTC m=+897.337679823" lastFinishedPulling="2026-03-08 05:41:17.50701361 +0000 UTC m=+904.424662494" observedRunningTime="2026-03-08 05:41:18.209708439 +0000 UTC m=+905.127357283" watchObservedRunningTime="2026-03-08 05:41:18.210210731 +0000 UTC m=+905.127859575" Mar 08 05:41:18 crc 
kubenswrapper[4717]: I0308 05:41:18.240315 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-57gbg" podStartSLOduration=1.563652 podStartE2EDuration="9.240293774s" podCreationTimestamp="2026-03-08 05:41:09 +0000 UTC" firstStartedPulling="2026-03-08 05:41:09.828937751 +0000 UTC m=+896.746586605" lastFinishedPulling="2026-03-08 05:41:17.505579495 +0000 UTC m=+904.423228379" observedRunningTime="2026-03-08 05:41:18.237959367 +0000 UTC m=+905.155608221" watchObservedRunningTime="2026-03-08 05:41:18.240293774 +0000 UTC m=+905.157942618" Mar 08 05:41:20 crc kubenswrapper[4717]: I0308 05:41:20.251305 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:20 crc kubenswrapper[4717]: I0308 05:41:20.251895 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:20 crc kubenswrapper[4717]: I0308 05:41:20.256053 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:21 crc kubenswrapper[4717]: I0308 05:41:21.207990 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d88669946-9ln4g" Mar 08 05:41:21 crc kubenswrapper[4717]: I0308 05:41:21.280306 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zhsbw"] Mar 08 05:41:23 crc kubenswrapper[4717]: I0308 05:41:23.221270 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-nvc4p" event={"ID":"474b5a28-e5de-4fdc-814e-588f604686f4","Type":"ContainerStarted","Data":"87563cae33bb0ae16626190bf2e5fc25e9cb4d2818b3a78c27f6b1a7c4c89221"} Mar 08 05:41:23 crc kubenswrapper[4717]: I0308 05:41:23.244330 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-69594cc75-nvc4p" podStartSLOduration=2.291792795 podStartE2EDuration="14.244300081s" podCreationTimestamp="2026-03-08 05:41:09 +0000 UTC" firstStartedPulling="2026-03-08 05:41:10.110570916 +0000 UTC m=+897.028219760" lastFinishedPulling="2026-03-08 05:41:22.063078162 +0000 UTC m=+908.980727046" observedRunningTime="2026-03-08 05:41:23.242173979 +0000 UTC m=+910.159822853" watchObservedRunningTime="2026-03-08 05:41:23.244300081 +0000 UTC m=+910.161948965" Mar 08 05:41:24 crc kubenswrapper[4717]: I0308 05:41:24.779260 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-57gbg" Mar 08 05:41:29 crc kubenswrapper[4717]: I0308 05:41:29.716529 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-w6x5m" Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.343099 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-zhsbw" podUID="cd118c79-042d-48f5-a360-884f4466f65b" containerName="console" containerID="cri-o://376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc" gracePeriod=15 Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.754449 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zhsbw_cd118c79-042d-48f5-a360-884f4466f65b/console/0.log" Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.754970 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.942672 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-service-ca\") pod \"cd118c79-042d-48f5-a360-884f4466f65b\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.942813 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-trusted-ca-bundle\") pod \"cd118c79-042d-48f5-a360-884f4466f65b\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.942886 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-console-config\") pod \"cd118c79-042d-48f5-a360-884f4466f65b\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.942920 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-oauth-config\") pod \"cd118c79-042d-48f5-a360-884f4466f65b\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.942979 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-serving-cert\") pod \"cd118c79-042d-48f5-a360-884f4466f65b\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.943011 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-2dr72\" (UniqueName: \"kubernetes.io/projected/cd118c79-042d-48f5-a360-884f4466f65b-kube-api-access-2dr72\") pod \"cd118c79-042d-48f5-a360-884f4466f65b\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.943042 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-oauth-serving-cert\") pod \"cd118c79-042d-48f5-a360-884f4466f65b\" (UID: \"cd118c79-042d-48f5-a360-884f4466f65b\") " Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.943942 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-service-ca" (OuterVolumeSpecName: "service-ca") pod "cd118c79-042d-48f5-a360-884f4466f65b" (UID: "cd118c79-042d-48f5-a360-884f4466f65b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.943967 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cd118c79-042d-48f5-a360-884f4466f65b" (UID: "cd118c79-042d-48f5-a360-884f4466f65b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.943986 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cd118c79-042d-48f5-a360-884f4466f65b" (UID: "cd118c79-042d-48f5-a360-884f4466f65b"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.944034 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-console-config" (OuterVolumeSpecName: "console-config") pod "cd118c79-042d-48f5-a360-884f4466f65b" (UID: "cd118c79-042d-48f5-a360-884f4466f65b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.951830 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cd118c79-042d-48f5-a360-884f4466f65b" (UID: "cd118c79-042d-48f5-a360-884f4466f65b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.952214 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd118c79-042d-48f5-a360-884f4466f65b-kube-api-access-2dr72" (OuterVolumeSpecName: "kube-api-access-2dr72") pod "cd118c79-042d-48f5-a360-884f4466f65b" (UID: "cd118c79-042d-48f5-a360-884f4466f65b"). InnerVolumeSpecName "kube-api-access-2dr72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:41:46 crc kubenswrapper[4717]: I0308 05:41:46.958530 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cd118c79-042d-48f5-a360-884f4466f65b" (UID: "cd118c79-042d-48f5-a360-884f4466f65b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.046080 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.046125 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.046141 4717 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-console-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.046154 4717 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.046171 4717 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd118c79-042d-48f5-a360-884f4466f65b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.046183 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dr72\" (UniqueName: \"kubernetes.io/projected/cd118c79-042d-48f5-a360-884f4466f65b-kube-api-access-2dr72\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.046193 4717 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd118c79-042d-48f5-a360-884f4466f65b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:47 crc 
kubenswrapper[4717]: I0308 05:41:47.446479 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zhsbw_cd118c79-042d-48f5-a360-884f4466f65b/console/0.log" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.447237 4717 generic.go:334] "Generic (PLEG): container finished" podID="cd118c79-042d-48f5-a360-884f4466f65b" containerID="376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc" exitCode=2 Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.447429 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zhsbw" event={"ID":"cd118c79-042d-48f5-a360-884f4466f65b","Type":"ContainerDied","Data":"376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc"} Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.447489 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zhsbw" event={"ID":"cd118c79-042d-48f5-a360-884f4466f65b","Type":"ContainerDied","Data":"79ec35763aee427ccc25a0b90c8d02265062968bba38bd95e4cce080a036438b"} Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.447531 4717 scope.go:117] "RemoveContainer" containerID="376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.447533 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zhsbw" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.478787 4717 scope.go:117] "RemoveContainer" containerID="376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc" Mar 08 05:41:47 crc kubenswrapper[4717]: E0308 05:41:47.479312 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc\": container with ID starting with 376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc not found: ID does not exist" containerID="376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.479367 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc"} err="failed to get container status \"376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc\": rpc error: code = NotFound desc = could not find container \"376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc\": container with ID starting with 376dcd3abae584f0453a58f4e122a704b77f716fe4678d5889badd265fbc74cc not found: ID does not exist" Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.507017 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zhsbw"] Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.517828 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-zhsbw"] Mar 08 05:41:47 crc kubenswrapper[4717]: I0308 05:41:47.804999 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd118c79-042d-48f5-a360-884f4466f65b" path="/var/lib/kubelet/pods/cd118c79-042d-48f5-a360-884f4466f65b/volumes" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.531803 4717 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt"] Mar 08 05:41:48 crc kubenswrapper[4717]: E0308 05:41:48.532190 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b113ae4b-122d-45a8-8731-99ce6125449e" containerName="extract-content" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.532205 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b113ae4b-122d-45a8-8731-99ce6125449e" containerName="extract-content" Mar 08 05:41:48 crc kubenswrapper[4717]: E0308 05:41:48.532216 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b113ae4b-122d-45a8-8731-99ce6125449e" containerName="registry-server" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.532221 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b113ae4b-122d-45a8-8731-99ce6125449e" containerName="registry-server" Mar 08 05:41:48 crc kubenswrapper[4717]: E0308 05:41:48.532242 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd118c79-042d-48f5-a360-884f4466f65b" containerName="console" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.532249 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd118c79-042d-48f5-a360-884f4466f65b" containerName="console" Mar 08 05:41:48 crc kubenswrapper[4717]: E0308 05:41:48.532265 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b113ae4b-122d-45a8-8731-99ce6125449e" containerName="extract-utilities" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.532271 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b113ae4b-122d-45a8-8731-99ce6125449e" containerName="extract-utilities" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.532386 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b113ae4b-122d-45a8-8731-99ce6125449e" containerName="registry-server" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.532402 4717 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="cd118c79-042d-48f5-a360-884f4466f65b" containerName="console" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.533422 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.540489 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.548582 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt"] Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.671474 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.671560 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.671659 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rrlm\" (UniqueName: \"kubernetes.io/projected/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-kube-api-access-6rrlm\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.774105 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.774179 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.774243 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rrlm\" (UniqueName: \"kubernetes.io/projected/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-kube-api-access-6rrlm\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.774834 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.775018 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.797932 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rrlm\" (UniqueName: \"kubernetes.io/projected/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-kube-api-access-6rrlm\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:48 crc kubenswrapper[4717]: I0308 05:41:48.851510 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:49 crc kubenswrapper[4717]: I0308 05:41:49.388988 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt"] Mar 08 05:41:49 crc kubenswrapper[4717]: I0308 05:41:49.471112 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" event={"ID":"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e","Type":"ContainerStarted","Data":"4cee0cc2ec3bd53767d1ad985324c1888c06ed015874ab8c2d0bce75b266e459"} Mar 08 05:41:50 crc kubenswrapper[4717]: I0308 05:41:50.483430 4717 generic.go:334] "Generic (PLEG): container finished" podID="0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" containerID="961441c5a02d6f3959b67aec3b8d3ee959fd401a4a7ab5dc61ecc1113a3acf9d" exitCode=0 Mar 08 05:41:50 crc kubenswrapper[4717]: I0308 05:41:50.484008 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" event={"ID":"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e","Type":"ContainerDied","Data":"961441c5a02d6f3959b67aec3b8d3ee959fd401a4a7ab5dc61ecc1113a3acf9d"} Mar 08 05:41:53 crc kubenswrapper[4717]: I0308 05:41:53.515415 4717 generic.go:334] "Generic (PLEG): container finished" podID="0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" containerID="c29cbd36049849c85f8e2f9763dcaf2d5ac8bc06e31d438b62ed1cb8880e89cc" exitCode=0 Mar 08 05:41:53 crc kubenswrapper[4717]: I0308 05:41:53.515508 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" event={"ID":"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e","Type":"ContainerDied","Data":"c29cbd36049849c85f8e2f9763dcaf2d5ac8bc06e31d438b62ed1cb8880e89cc"} Mar 08 05:41:54 crc kubenswrapper[4717]: I0308 05:41:54.533487 4717 
generic.go:334] "Generic (PLEG): container finished" podID="0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" containerID="c7747b5110d8e69fed08ae617dd7be4cf7085de6a1cf41fbb43194f1d9d6f2a2" exitCode=0 Mar 08 05:41:54 crc kubenswrapper[4717]: I0308 05:41:54.533618 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" event={"ID":"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e","Type":"ContainerDied","Data":"c7747b5110d8e69fed08ae617dd7be4cf7085de6a1cf41fbb43194f1d9d6f2a2"} Mar 08 05:41:55 crc kubenswrapper[4717]: I0308 05:41:55.889218 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:41:55 crc kubenswrapper[4717]: I0308 05:41:55.898917 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rrlm\" (UniqueName: \"kubernetes.io/projected/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-kube-api-access-6rrlm\") pod \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " Mar 08 05:41:55 crc kubenswrapper[4717]: I0308 05:41:55.898988 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-util\") pod \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " Mar 08 05:41:55 crc kubenswrapper[4717]: I0308 05:41:55.899118 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-bundle\") pod \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\" (UID: \"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e\") " Mar 08 05:41:55 crc kubenswrapper[4717]: I0308 05:41:55.900599 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-bundle" (OuterVolumeSpecName: "bundle") pod "0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" (UID: "0531d9bf-3f78-45d6-af95-ec8b54e8fc1e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:41:55 crc kubenswrapper[4717]: I0308 05:41:55.909493 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-kube-api-access-6rrlm" (OuterVolumeSpecName: "kube-api-access-6rrlm") pod "0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" (UID: "0531d9bf-3f78-45d6-af95-ec8b54e8fc1e"). InnerVolumeSpecName "kube-api-access-6rrlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:41:55 crc kubenswrapper[4717]: I0308 05:41:55.912542 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-util" (OuterVolumeSpecName: "util") pod "0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" (UID: "0531d9bf-3f78-45d6-af95-ec8b54e8fc1e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:41:56 crc kubenswrapper[4717]: I0308 05:41:56.000973 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rrlm\" (UniqueName: \"kubernetes.io/projected/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-kube-api-access-6rrlm\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:56 crc kubenswrapper[4717]: I0308 05:41:56.001043 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-util\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:56 crc kubenswrapper[4717]: I0308 05:41:56.001065 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0531d9bf-3f78-45d6-af95-ec8b54e8fc1e-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:41:56 crc kubenswrapper[4717]: I0308 05:41:56.556931 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" event={"ID":"0531d9bf-3f78-45d6-af95-ec8b54e8fc1e","Type":"ContainerDied","Data":"4cee0cc2ec3bd53767d1ad985324c1888c06ed015874ab8c2d0bce75b266e459"} Mar 08 05:41:56 crc kubenswrapper[4717]: I0308 05:41:56.556999 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cee0cc2ec3bd53767d1ad985324c1888c06ed015874ab8c2d0bce75b266e459" Mar 08 05:41:56 crc kubenswrapper[4717]: I0308 05:41:56.557972 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt" Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.155058 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549142-pp5ds"] Mar 08 05:42:00 crc kubenswrapper[4717]: E0308 05:42:00.156083 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" containerName="util" Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.156109 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" containerName="util" Mar 08 05:42:00 crc kubenswrapper[4717]: E0308 05:42:00.156131 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" containerName="extract" Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.156143 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" containerName="extract" Mar 08 05:42:00 crc kubenswrapper[4717]: E0308 05:42:00.156174 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" containerName="pull" Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.156188 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" containerName="pull" Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.156376 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0531d9bf-3f78-45d6-af95-ec8b54e8fc1e" containerName="extract" Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.157148 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549142-pp5ds"
Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.164956 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.167206 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549142-pp5ds"]
Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.167363 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm"
Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.171319 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.171581 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsjhs\" (UniqueName: \"kubernetes.io/projected/db22cf04-a81d-4990-8f88-41147bf3f149-kube-api-access-zsjhs\") pod \"auto-csr-approver-29549142-pp5ds\" (UID: \"db22cf04-a81d-4990-8f88-41147bf3f149\") " pod="openshift-infra/auto-csr-approver-29549142-pp5ds"
Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.273651 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsjhs\" (UniqueName: \"kubernetes.io/projected/db22cf04-a81d-4990-8f88-41147bf3f149-kube-api-access-zsjhs\") pod \"auto-csr-approver-29549142-pp5ds\" (UID: \"db22cf04-a81d-4990-8f88-41147bf3f149\") " pod="openshift-infra/auto-csr-approver-29549142-pp5ds"
Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.297671 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsjhs\" (UniqueName: \"kubernetes.io/projected/db22cf04-a81d-4990-8f88-41147bf3f149-kube-api-access-zsjhs\") pod \"auto-csr-approver-29549142-pp5ds\" (UID: \"db22cf04-a81d-4990-8f88-41147bf3f149\") " pod="openshift-infra/auto-csr-approver-29549142-pp5ds"
Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.476620 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549142-pp5ds"
Mar 08 05:42:00 crc kubenswrapper[4717]: I0308 05:42:00.803235 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549142-pp5ds"]
Mar 08 05:42:01 crc kubenswrapper[4717]: I0308 05:42:01.616436 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549142-pp5ds" event={"ID":"db22cf04-a81d-4990-8f88-41147bf3f149","Type":"ContainerStarted","Data":"5e72620d04afd374a7d236583922aa0576058c5494a9faed0c2eb3cf8d7092fe"}
Mar 08 05:42:02 crc kubenswrapper[4717]: I0308 05:42:02.628040 4717 generic.go:334] "Generic (PLEG): container finished" podID="db22cf04-a81d-4990-8f88-41147bf3f149" containerID="67fa1baa53e060f43bc4bef1c40e9e9a00db4675f8154bea28df2ea9f4b62c84" exitCode=0
Mar 08 05:42:02 crc kubenswrapper[4717]: I0308 05:42:02.628124 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549142-pp5ds" event={"ID":"db22cf04-a81d-4990-8f88-41147bf3f149","Type":"ContainerDied","Data":"67fa1baa53e060f43bc4bef1c40e9e9a00db4675f8154bea28df2ea9f4b62c84"}
Mar 08 05:42:03 crc kubenswrapper[4717]: I0308 05:42:03.924655 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549142-pp5ds"
Mar 08 05:42:04 crc kubenswrapper[4717]: I0308 05:42:04.031069 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsjhs\" (UniqueName: \"kubernetes.io/projected/db22cf04-a81d-4990-8f88-41147bf3f149-kube-api-access-zsjhs\") pod \"db22cf04-a81d-4990-8f88-41147bf3f149\" (UID: \"db22cf04-a81d-4990-8f88-41147bf3f149\") "
Mar 08 05:42:04 crc kubenswrapper[4717]: I0308 05:42:04.037103 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db22cf04-a81d-4990-8f88-41147bf3f149-kube-api-access-zsjhs" (OuterVolumeSpecName: "kube-api-access-zsjhs") pod "db22cf04-a81d-4990-8f88-41147bf3f149" (UID: "db22cf04-a81d-4990-8f88-41147bf3f149"). InnerVolumeSpecName "kube-api-access-zsjhs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:42:04 crc kubenswrapper[4717]: I0308 05:42:04.132995 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsjhs\" (UniqueName: \"kubernetes.io/projected/db22cf04-a81d-4990-8f88-41147bf3f149-kube-api-access-zsjhs\") on node \"crc\" DevicePath \"\""
Mar 08 05:42:04 crc kubenswrapper[4717]: I0308 05:42:04.654419 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549142-pp5ds" event={"ID":"db22cf04-a81d-4990-8f88-41147bf3f149","Type":"ContainerDied","Data":"5e72620d04afd374a7d236583922aa0576058c5494a9faed0c2eb3cf8d7092fe"}
Mar 08 05:42:04 crc kubenswrapper[4717]: I0308 05:42:04.654912 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e72620d04afd374a7d236583922aa0576058c5494a9faed0c2eb3cf8d7092fe"
Mar 08 05:42:04 crc kubenswrapper[4717]: I0308 05:42:04.654493 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549142-pp5ds"
Mar 08 05:42:05 crc kubenswrapper[4717]: I0308 05:42:05.010611 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549136-f6jtt"]
Mar 08 05:42:05 crc kubenswrapper[4717]: I0308 05:42:05.019661 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549136-f6jtt"]
Mar 08 05:42:05 crc kubenswrapper[4717]: I0308 05:42:05.795360 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f1879f-6f47-473f-a15e-96947179c63b" path="/var/lib/kubelet/pods/f1f1879f-6f47-473f-a15e-96947179c63b/volumes"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.881444 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"]
Mar 08 05:42:06 crc kubenswrapper[4717]: E0308 05:42:06.881789 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db22cf04-a81d-4990-8f88-41147bf3f149" containerName="oc"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.881805 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="db22cf04-a81d-4990-8f88-41147bf3f149" containerName="oc"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.881919 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="db22cf04-a81d-4990-8f88-41147bf3f149" containerName="oc"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.882441 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.885881 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.886421 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.887407 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.887519 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.887894 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-848qp"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.958677 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"]
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.980350 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/de9b0a52-bf0f-4566-bcb4-f52c31916a41-webhook-cert\") pod \"metallb-operator-controller-manager-7d59c89549-fbpjz\" (UID: \"de9b0a52-bf0f-4566-bcb4-f52c31916a41\") " pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.980417 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/de9b0a52-bf0f-4566-bcb4-f52c31916a41-apiservice-cert\") pod \"metallb-operator-controller-manager-7d59c89549-fbpjz\" (UID: \"de9b0a52-bf0f-4566-bcb4-f52c31916a41\") " pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:06 crc kubenswrapper[4717]: I0308 05:42:06.980505 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knt66\" (UniqueName: \"kubernetes.io/projected/de9b0a52-bf0f-4566-bcb4-f52c31916a41-kube-api-access-knt66\") pod \"metallb-operator-controller-manager-7d59c89549-fbpjz\" (UID: \"de9b0a52-bf0f-4566-bcb4-f52c31916a41\") " pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.081660 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/de9b0a52-bf0f-4566-bcb4-f52c31916a41-apiservice-cert\") pod \"metallb-operator-controller-manager-7d59c89549-fbpjz\" (UID: \"de9b0a52-bf0f-4566-bcb4-f52c31916a41\") " pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.081781 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knt66\" (UniqueName: \"kubernetes.io/projected/de9b0a52-bf0f-4566-bcb4-f52c31916a41-kube-api-access-knt66\") pod \"metallb-operator-controller-manager-7d59c89549-fbpjz\" (UID: \"de9b0a52-bf0f-4566-bcb4-f52c31916a41\") " pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.081854 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/de9b0a52-bf0f-4566-bcb4-f52c31916a41-webhook-cert\") pod \"metallb-operator-controller-manager-7d59c89549-fbpjz\" (UID: \"de9b0a52-bf0f-4566-bcb4-f52c31916a41\") " pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.089857 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/de9b0a52-bf0f-4566-bcb4-f52c31916a41-apiservice-cert\") pod \"metallb-operator-controller-manager-7d59c89549-fbpjz\" (UID: \"de9b0a52-bf0f-4566-bcb4-f52c31916a41\") " pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.102794 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knt66\" (UniqueName: \"kubernetes.io/projected/de9b0a52-bf0f-4566-bcb4-f52c31916a41-kube-api-access-knt66\") pod \"metallb-operator-controller-manager-7d59c89549-fbpjz\" (UID: \"de9b0a52-bf0f-4566-bcb4-f52c31916a41\") " pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.104800 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/de9b0a52-bf0f-4566-bcb4-f52c31916a41-webhook-cert\") pod \"metallb-operator-controller-manager-7d59c89549-fbpjz\" (UID: \"de9b0a52-bf0f-4566-bcb4-f52c31916a41\") " pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.199831 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.211647 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f74747698-24c78"]
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.212577 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.215712 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.215944 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.218407 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xp7j9"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.228452 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f74747698-24c78"]
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.386090 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3114eda7-af43-45d9-955c-116f643af398-webhook-cert\") pod \"metallb-operator-webhook-server-7f74747698-24c78\" (UID: \"3114eda7-af43-45d9-955c-116f643af398\") " pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.386195 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv9bf\" (UniqueName: \"kubernetes.io/projected/3114eda7-af43-45d9-955c-116f643af398-kube-api-access-bv9bf\") pod \"metallb-operator-webhook-server-7f74747698-24c78\" (UID: \"3114eda7-af43-45d9-955c-116f643af398\") " pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.386229 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3114eda7-af43-45d9-955c-116f643af398-apiservice-cert\") pod \"metallb-operator-webhook-server-7f74747698-24c78\" (UID: \"3114eda7-af43-45d9-955c-116f643af398\") " pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.487788 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3114eda7-af43-45d9-955c-116f643af398-webhook-cert\") pod \"metallb-operator-webhook-server-7f74747698-24c78\" (UID: \"3114eda7-af43-45d9-955c-116f643af398\") " pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.488402 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv9bf\" (UniqueName: \"kubernetes.io/projected/3114eda7-af43-45d9-955c-116f643af398-kube-api-access-bv9bf\") pod \"metallb-operator-webhook-server-7f74747698-24c78\" (UID: \"3114eda7-af43-45d9-955c-116f643af398\") " pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.488433 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3114eda7-af43-45d9-955c-116f643af398-apiservice-cert\") pod \"metallb-operator-webhook-server-7f74747698-24c78\" (UID: \"3114eda7-af43-45d9-955c-116f643af398\") " pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.496046 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3114eda7-af43-45d9-955c-116f643af398-webhook-cert\") pod \"metallb-operator-webhook-server-7f74747698-24c78\" (UID: \"3114eda7-af43-45d9-955c-116f643af398\") " pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.501478 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3114eda7-af43-45d9-955c-116f643af398-apiservice-cert\") pod \"metallb-operator-webhook-server-7f74747698-24c78\" (UID: \"3114eda7-af43-45d9-955c-116f643af398\") " pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.520225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv9bf\" (UniqueName: \"kubernetes.io/projected/3114eda7-af43-45d9-955c-116f643af398-kube-api-access-bv9bf\") pod \"metallb-operator-webhook-server-7f74747698-24c78\" (UID: \"3114eda7-af43-45d9-955c-116f643af398\") " pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.573397 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.672380 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"]
Mar 08 05:42:07 crc kubenswrapper[4717]: W0308 05:42:07.729212 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde9b0a52_bf0f_4566_bcb4_f52c31916a41.slice/crio-a3a01cfd275f9b84ab084517554cffea82649b0afa2150dca59f0fd20083d526 WatchSource:0}: Error finding container a3a01cfd275f9b84ab084517554cffea82649b0afa2150dca59f0fd20083d526: Status 404 returned error can't find the container with id a3a01cfd275f9b84ab084517554cffea82649b0afa2150dca59f0fd20083d526
Mar 08 05:42:07 crc kubenswrapper[4717]: I0308 05:42:07.875801 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f74747698-24c78"]
Mar 08 05:42:07 crc kubenswrapper[4717]: W0308 05:42:07.881910 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3114eda7_af43_45d9_955c_116f643af398.slice/crio-7c091a4eb99fc9869f2657d7cbaf688d957ae9eeede0040cf09cd15c08bdd637 WatchSource:0}: Error finding container 7c091a4eb99fc9869f2657d7cbaf688d957ae9eeede0040cf09cd15c08bdd637: Status 404 returned error can't find the container with id 7c091a4eb99fc9869f2657d7cbaf688d957ae9eeede0040cf09cd15c08bdd637
Mar 08 05:42:08 crc kubenswrapper[4717]: I0308 05:42:08.694033 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz" event={"ID":"de9b0a52-bf0f-4566-bcb4-f52c31916a41","Type":"ContainerStarted","Data":"a3a01cfd275f9b84ab084517554cffea82649b0afa2150dca59f0fd20083d526"}
Mar 08 05:42:08 crc kubenswrapper[4717]: I0308 05:42:08.695275 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78" event={"ID":"3114eda7-af43-45d9-955c-116f643af398","Type":"ContainerStarted","Data":"7c091a4eb99fc9869f2657d7cbaf688d957ae9eeede0040cf09cd15c08bdd637"}
Mar 08 05:42:13 crc kubenswrapper[4717]: I0308 05:42:13.741748 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz" event={"ID":"de9b0a52-bf0f-4566-bcb4-f52c31916a41","Type":"ContainerStarted","Data":"a718d24abec3413dbd00fa29bc530641201e68cb757a5fb2a615a3bfed366a21"}
Mar 08 05:42:13 crc kubenswrapper[4717]: I0308 05:42:13.742734 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:13 crc kubenswrapper[4717]: I0308 05:42:13.744853 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78" event={"ID":"3114eda7-af43-45d9-955c-116f643af398","Type":"ContainerStarted","Data":"0d2b359eae82988f640fcd65ecc38c6c837adbf5de1fc6322bdc4e2d200feddb"}
Mar 08 05:42:13 crc kubenswrapper[4717]: I0308 05:42:13.745054 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:13 crc kubenswrapper[4717]: I0308 05:42:13.795546 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz" podStartSLOduration=2.234209662 podStartE2EDuration="7.795520612s" podCreationTimestamp="2026-03-08 05:42:06 +0000 UTC" firstStartedPulling="2026-03-08 05:42:07.73720593 +0000 UTC m=+954.654854774" lastFinishedPulling="2026-03-08 05:42:13.29851689 +0000 UTC m=+960.216165724" observedRunningTime="2026-03-08 05:42:13.792561159 +0000 UTC m=+960.710210013" watchObservedRunningTime="2026-03-08 05:42:13.795520612 +0000 UTC m=+960.713169466"
Mar 08 05:42:24 crc kubenswrapper[4717]: I0308 05:42:24.275117 4717 scope.go:117] "RemoveContainer" containerID="7d5903acb87ab40f377eabefcb5de268caedc87b7afcd55c21ceef15eff48419"
Mar 08 05:42:27 crc kubenswrapper[4717]: I0308 05:42:27.581510 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78"
Mar 08 05:42:27 crc kubenswrapper[4717]: I0308 05:42:27.612803 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7f74747698-24c78" podStartSLOduration=15.174464712 podStartE2EDuration="20.612771634s" podCreationTimestamp="2026-03-08 05:42:07 +0000 UTC" firstStartedPulling="2026-03-08 05:42:07.885893354 +0000 UTC m=+954.803542198" lastFinishedPulling="2026-03-08 05:42:13.324200266 +0000 UTC m=+960.241849120" observedRunningTime="2026-03-08 05:42:13.82788638 +0000 UTC m=+960.745535224" watchObservedRunningTime="2026-03-08 05:42:27.612771634 +0000 UTC m=+974.530420518"
Mar 08 05:42:34 crc kubenswrapper[4717]: I0308 05:42:34.120330 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 05:42:34 crc kubenswrapper[4717]: I0308 05:42:34.121349 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.024766 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m6v47"]
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.028313 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.057459 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6v47"]
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.092286 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-utilities\") pod \"community-operators-m6v47\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.092478 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-catalog-content\") pod \"community-operators-m6v47\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.092532 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swrhg\" (UniqueName: \"kubernetes.io/projected/3bffbae6-69e3-46b7-bb39-af321868f9d8-kube-api-access-swrhg\") pod \"community-operators-m6v47\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.194827 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-utilities\") pod \"community-operators-m6v47\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.194945 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-catalog-content\") pod \"community-operators-m6v47\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.194972 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swrhg\" (UniqueName: \"kubernetes.io/projected/3bffbae6-69e3-46b7-bb39-af321868f9d8-kube-api-access-swrhg\") pod \"community-operators-m6v47\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.195738 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-utilities\") pod \"community-operators-m6v47\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.195854 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-catalog-content\") pod \"community-operators-m6v47\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.227325 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swrhg\" (UniqueName: \"kubernetes.io/projected/3bffbae6-69e3-46b7-bb39-af321868f9d8-kube-api-access-swrhg\") pod \"community-operators-m6v47\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.390056 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6v47"
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.935712 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6v47"]
Mar 08 05:42:42 crc kubenswrapper[4717]: I0308 05:42:42.990924 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6v47" event={"ID":"3bffbae6-69e3-46b7-bb39-af321868f9d8","Type":"ContainerStarted","Data":"806a8629a212b7103563956afd77f4af172e6cb0fc3cfc5a093bcd4ef0dfa8bc"}
Mar 08 05:42:44 crc kubenswrapper[4717]: I0308 05:42:44.001639 4717 generic.go:334] "Generic (PLEG): container finished" podID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerID="450c6251ec5f401560536e3572697a699e88a427ca7f03bc5b1b9b4f190c54e6" exitCode=0
Mar 08 05:42:44 crc kubenswrapper[4717]: I0308 05:42:44.001751 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6v47" event={"ID":"3bffbae6-69e3-46b7-bb39-af321868f9d8","Type":"ContainerDied","Data":"450c6251ec5f401560536e3572697a699e88a427ca7f03bc5b1b9b4f190c54e6"}
Mar 08 05:42:46 crc kubenswrapper[4717]: I0308 05:42:46.960851 4717 generic.go:334] "Generic (PLEG): container finished" podID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerID="3bfcb1af7dc5a7e5f6d9a27621d5af28548b0a6c6ee77313ad02b6d363137600" exitCode=0
Mar 08 05:42:46 crc kubenswrapper[4717]: I0308 05:42:46.960923 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6v47" event={"ID":"3bffbae6-69e3-46b7-bb39-af321868f9d8","Type":"ContainerDied","Data":"3bfcb1af7dc5a7e5f6d9a27621d5af28548b0a6c6ee77313ad02b6d363137600"}
Mar 08 05:42:47 crc kubenswrapper[4717]: I0308 05:42:47.204289 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7d59c89549-fbpjz"
Mar 08 05:42:47 crc kubenswrapper[4717]: I0308 05:42:47.970837 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6v47" event={"ID":"3bffbae6-69e3-46b7-bb39-af321868f9d8","Type":"ContainerStarted","Data":"daf49b0669e331b59755b6f395776b8cb33cfb65596ef849d8cc1df37416246a"}
Mar 08 05:42:47 crc kubenswrapper[4717]: I0308 05:42:47.997528 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m6v47" podStartSLOduration=3.572349621 podStartE2EDuration="6.997502967s" podCreationTimestamp="2026-03-08 05:42:41 +0000 UTC" firstStartedPulling="2026-03-08 05:42:44.00374192 +0000 UTC m=+990.921390774" lastFinishedPulling="2026-03-08 05:42:47.428895276 +0000 UTC m=+994.346544120" observedRunningTime="2026-03-08 05:42:47.992938314 +0000 UTC m=+994.910587168" watchObservedRunningTime="2026-03-08 05:42:47.997502967 +0000 UTC m=+994.915151801"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.426215 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-wqp4w"]
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.430395 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.437950 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg"]
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.439020 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.442525 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.442801 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.443259 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7phjm"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.443422 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.459492 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg"]
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.507962 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-metrics\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.508031 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-frr-conf\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.508052 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-metrics-certs\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.508069 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-frr-startup\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.508090 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkv98\" (UniqueName: \"kubernetes.io/projected/88b4e6f2-24f0-4e67-ab40-e3621ab5b44f-kube-api-access-dkv98\") pod \"frr-k8s-webhook-server-7f989f654f-vwtjg\" (UID: \"88b4e6f2-24f0-4e67-ab40-e3621ab5b44f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.508194 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88b4e6f2-24f0-4e67-ab40-e3621ab5b44f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-vwtjg\" (UID: \"88b4e6f2-24f0-4e67-ab40-e3621ab5b44f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.508246 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc45z\" (UniqueName: \"kubernetes.io/projected/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-kube-api-access-cc45z\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.508323 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-reloader\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.508375 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-frr-sockets\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.545106 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-27q99"]
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.546460 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-27q99"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.553204 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-pxjl9"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.553991 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.555548 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.559589 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.568581 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-96ksw"]
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.570067 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-96ksw"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.574338 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.581044 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-96ksw"]
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610008 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc45z\" (UniqueName: \"kubernetes.io/projected/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-kube-api-access-cc45z\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610078 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-metallb-excludel2\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610112 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-reloader\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610141 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-frr-sockets\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-memberlist\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610184 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-metrics\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610200 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2hjz\" (UniqueName: \"kubernetes.io/projected/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-kube-api-access-q2hjz\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610238 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-metrics-certs\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610269 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-frr-conf\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w"
Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName:
\"kubernetes.io/secret/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-metrics-certs\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610300 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-frr-startup\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610320 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkv98\" (UniqueName: \"kubernetes.io/projected/88b4e6f2-24f0-4e67-ab40-e3621ab5b44f-kube-api-access-dkv98\") pod \"frr-k8s-webhook-server-7f989f654f-vwtjg\" (UID: \"88b4e6f2-24f0-4e67-ab40-e3621ab5b44f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.610346 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88b4e6f2-24f0-4e67-ab40-e3621ab5b44f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-vwtjg\" (UID: \"88b4e6f2-24f0-4e67-ab40-e3621ab5b44f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.611246 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-reloader\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.611366 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-metrics\") pod \"frr-k8s-wqp4w\" (UID: 
\"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.611588 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-frr-sockets\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.611834 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-frr-conf\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.612466 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-frr-startup\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.619847 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88b4e6f2-24f0-4e67-ab40-e3621ab5b44f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-vwtjg\" (UID: \"88b4e6f2-24f0-4e67-ab40-e3621ab5b44f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.622320 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-metrics-certs\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.632663 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cc45z\" (UniqueName: \"kubernetes.io/projected/804f0686-e4ef-4cd6-bbe2-a2e7788759e2-kube-api-access-cc45z\") pod \"frr-k8s-wqp4w\" (UID: \"804f0686-e4ef-4cd6-bbe2-a2e7788759e2\") " pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.635544 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkv98\" (UniqueName: \"kubernetes.io/projected/88b4e6f2-24f0-4e67-ab40-e3621ab5b44f-kube-api-access-dkv98\") pod \"frr-k8s-webhook-server-7f989f654f-vwtjg\" (UID: \"88b4e6f2-24f0-4e67-ab40-e3621ab5b44f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.711438 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-memberlist\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.711492 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2hjz\" (UniqueName: \"kubernetes.io/projected/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-kube-api-access-q2hjz\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.711515 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7zsw\" (UniqueName: \"kubernetes.io/projected/65f01f58-dbf5-4547-9249-ab613d4f85db-kube-api-access-g7zsw\") pod \"controller-86ddb6bd46-96ksw\" (UID: \"65f01f58-dbf5-4547-9249-ab613d4f85db\") " pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.711547 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/65f01f58-dbf5-4547-9249-ab613d4f85db-cert\") pod \"controller-86ddb6bd46-96ksw\" (UID: \"65f01f58-dbf5-4547-9249-ab613d4f85db\") " pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.711564 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-metrics-certs\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.711584 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f01f58-dbf5-4547-9249-ab613d4f85db-metrics-certs\") pod \"controller-86ddb6bd46-96ksw\" (UID: \"65f01f58-dbf5-4547-9249-ab613d4f85db\") " pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.711646 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-metallb-excludel2\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99" Mar 08 05:42:48 crc kubenswrapper[4717]: E0308 05:42:48.712017 4717 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 05:42:48 crc kubenswrapper[4717]: E0308 05:42:48.712126 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-memberlist podName:b4a0e98d-c9c3-4d97-ab3a-cd63903fd104 nodeName:}" failed. No retries permitted until 2026-03-08 05:42:49.212096102 +0000 UTC m=+996.129744946 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-memberlist") pod "speaker-27q99" (UID: "b4a0e98d-c9c3-4d97-ab3a-cd63903fd104") : secret "metallb-memberlist" not found Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.712412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-metallb-excludel2\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.715236 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-metrics-certs\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.738581 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2hjz\" (UniqueName: \"kubernetes.io/projected/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-kube-api-access-q2hjz\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.756464 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.765388 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.833228 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7zsw\" (UniqueName: \"kubernetes.io/projected/65f01f58-dbf5-4547-9249-ab613d4f85db-kube-api-access-g7zsw\") pod \"controller-86ddb6bd46-96ksw\" (UID: \"65f01f58-dbf5-4547-9249-ab613d4f85db\") " pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.833308 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/65f01f58-dbf5-4547-9249-ab613d4f85db-cert\") pod \"controller-86ddb6bd46-96ksw\" (UID: \"65f01f58-dbf5-4547-9249-ab613d4f85db\") " pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.833342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f01f58-dbf5-4547-9249-ab613d4f85db-metrics-certs\") pod \"controller-86ddb6bd46-96ksw\" (UID: \"65f01f58-dbf5-4547-9249-ab613d4f85db\") " pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.837874 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.838527 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f01f58-dbf5-4547-9249-ab613d4f85db-metrics-certs\") pod \"controller-86ddb6bd46-96ksw\" (UID: \"65f01f58-dbf5-4547-9249-ab613d4f85db\") " pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.849398 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/65f01f58-dbf5-4547-9249-ab613d4f85db-cert\") pod \"controller-86ddb6bd46-96ksw\" (UID: \"65f01f58-dbf5-4547-9249-ab613d4f85db\") " pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.858747 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7zsw\" (UniqueName: \"kubernetes.io/projected/65f01f58-dbf5-4547-9249-ab613d4f85db-kube-api-access-g7zsw\") pod \"controller-86ddb6bd46-96ksw\" (UID: \"65f01f58-dbf5-4547-9249-ab613d4f85db\") " pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.889386 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:48 crc kubenswrapper[4717]: I0308 05:42:48.987132 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wqp4w" event={"ID":"804f0686-e4ef-4cd6-bbe2-a2e7788759e2","Type":"ContainerStarted","Data":"66efea0a337db26bf6d8a2b32b34d8e345ff93d03406544053156c6803c0d314"} Mar 08 05:42:49 crc kubenswrapper[4717]: I0308 05:42:49.045502 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg"] Mar 08 05:42:49 crc kubenswrapper[4717]: W0308 05:42:49.052732 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88b4e6f2_24f0_4e67_ab40_e3621ab5b44f.slice/crio-d594b384518d18b282a83e970721a649df73653ce4a3419e81e60823b3f8a041 WatchSource:0}: Error finding container d594b384518d18b282a83e970721a649df73653ce4a3419e81e60823b3f8a041: Status 404 returned error can't find the container with id d594b384518d18b282a83e970721a649df73653ce4a3419e81e60823b3f8a041 Mar 08 05:42:49 crc kubenswrapper[4717]: I0308 05:42:49.125201 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-96ksw"] Mar 08 05:42:49 
crc kubenswrapper[4717]: I0308 05:42:49.242502 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-memberlist\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99" Mar 08 05:42:49 crc kubenswrapper[4717]: E0308 05:42:49.242755 4717 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 05:42:49 crc kubenswrapper[4717]: E0308 05:42:49.242858 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-memberlist podName:b4a0e98d-c9c3-4d97-ab3a-cd63903fd104 nodeName:}" failed. No retries permitted until 2026-03-08 05:42:50.242832497 +0000 UTC m=+997.160481341 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-memberlist") pod "speaker-27q99" (UID: "b4a0e98d-c9c3-4d97-ab3a-cd63903fd104") : secret "metallb-memberlist" not found Mar 08 05:42:50 crc kubenswrapper[4717]: I0308 05:42:50.000954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg" event={"ID":"88b4e6f2-24f0-4e67-ab40-e3621ab5b44f","Type":"ContainerStarted","Data":"d594b384518d18b282a83e970721a649df73653ce4a3419e81e60823b3f8a041"} Mar 08 05:42:50 crc kubenswrapper[4717]: I0308 05:42:50.003992 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-96ksw" event={"ID":"65f01f58-dbf5-4547-9249-ab613d4f85db","Type":"ContainerStarted","Data":"812471039a3967dd45dd5b2fe7bc4b7b2da178bec5b06a0c7526b6b9b675eece"} Mar 08 05:42:50 crc kubenswrapper[4717]: I0308 05:42:50.004041 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-96ksw" 
event={"ID":"65f01f58-dbf5-4547-9249-ab613d4f85db","Type":"ContainerStarted","Data":"d80ce4b1117bf720adc4e068028624cf26e42b3851172e44136f8b9d4506219a"} Mar 08 05:42:50 crc kubenswrapper[4717]: I0308 05:42:50.004062 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-96ksw" event={"ID":"65f01f58-dbf5-4547-9249-ab613d4f85db","Type":"ContainerStarted","Data":"352759b590af7218ba75069ef168640861248f6c60b6d4a49b18c918b4d20d07"} Mar 08 05:42:50 crc kubenswrapper[4717]: I0308 05:42:50.004264 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:42:50 crc kubenswrapper[4717]: I0308 05:42:50.036649 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-96ksw" podStartSLOduration=2.036616867 podStartE2EDuration="2.036616867s" podCreationTimestamp="2026-03-08 05:42:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:42:50.034553056 +0000 UTC m=+996.952201910" watchObservedRunningTime="2026-03-08 05:42:50.036616867 +0000 UTC m=+996.954265751" Mar 08 05:42:50 crc kubenswrapper[4717]: I0308 05:42:50.259908 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-memberlist\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99" Mar 08 05:42:50 crc kubenswrapper[4717]: I0308 05:42:50.283233 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4a0e98d-c9c3-4d97-ab3a-cd63903fd104-memberlist\") pod \"speaker-27q99\" (UID: \"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104\") " pod="metallb-system/speaker-27q99" Mar 08 05:42:50 crc kubenswrapper[4717]: I0308 05:42:50.364285 4717 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-27q99" Mar 08 05:42:51 crc kubenswrapper[4717]: I0308 05:42:51.026151 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-27q99" event={"ID":"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104","Type":"ContainerStarted","Data":"6ab3722c6c106da9cca61a976d18e807758847efd40a2d6c85544ba5611a5bb3"} Mar 08 05:42:51 crc kubenswrapper[4717]: I0308 05:42:51.028043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-27q99" event={"ID":"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104","Type":"ContainerStarted","Data":"012341b995ed619d9e07ddbe60d5888949260bf1556f000834736f57d37f7c46"} Mar 08 05:42:51 crc kubenswrapper[4717]: I0308 05:42:51.028088 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-27q99" event={"ID":"b4a0e98d-c9c3-4d97-ab3a-cd63903fd104","Type":"ContainerStarted","Data":"a0989cd3975a68260654834a0f21c0a08611d3e25e7796cc9392207638dbe2da"} Mar 08 05:42:51 crc kubenswrapper[4717]: I0308 05:42:51.029503 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-27q99" Mar 08 05:42:51 crc kubenswrapper[4717]: I0308 05:42:51.048094 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-27q99" podStartSLOduration=3.048065193 podStartE2EDuration="3.048065193s" podCreationTimestamp="2026-03-08 05:42:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:42:51.047192842 +0000 UTC m=+997.964841686" watchObservedRunningTime="2026-03-08 05:42:51.048065193 +0000 UTC m=+997.965714037" Mar 08 05:42:52 crc kubenswrapper[4717]: I0308 05:42:52.391015 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6v47" Mar 08 05:42:52 crc kubenswrapper[4717]: I0308 05:42:52.392165 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m6v47" Mar 08 05:42:52 crc kubenswrapper[4717]: I0308 05:42:52.451264 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m6v47" Mar 08 05:42:53 crc kubenswrapper[4717]: I0308 05:42:53.102378 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m6v47" Mar 08 05:42:53 crc kubenswrapper[4717]: I0308 05:42:53.150727 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6v47"] Mar 08 05:42:55 crc kubenswrapper[4717]: I0308 05:42:55.062809 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m6v47" podUID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerName="registry-server" containerID="cri-o://daf49b0669e331b59755b6f395776b8cb33cfb65596ef849d8cc1df37416246a" gracePeriod=2 Mar 08 05:42:56 crc kubenswrapper[4717]: I0308 05:42:56.079907 4717 generic.go:334] "Generic (PLEG): container finished" podID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerID="daf49b0669e331b59755b6f395776b8cb33cfb65596ef849d8cc1df37416246a" exitCode=0 Mar 08 05:42:56 crc kubenswrapper[4717]: I0308 05:42:56.079994 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6v47" event={"ID":"3bffbae6-69e3-46b7-bb39-af321868f9d8","Type":"ContainerDied","Data":"daf49b0669e331b59755b6f395776b8cb33cfb65596ef849d8cc1df37416246a"} Mar 08 05:42:57 crc kubenswrapper[4717]: I0308 05:42:57.433082 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6v47" Mar 08 05:42:57 crc kubenswrapper[4717]: I0308 05:42:57.493440 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-catalog-content\") pod \"3bffbae6-69e3-46b7-bb39-af321868f9d8\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " Mar 08 05:42:57 crc kubenswrapper[4717]: I0308 05:42:57.493538 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-utilities\") pod \"3bffbae6-69e3-46b7-bb39-af321868f9d8\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " Mar 08 05:42:57 crc kubenswrapper[4717]: I0308 05:42:57.493610 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swrhg\" (UniqueName: \"kubernetes.io/projected/3bffbae6-69e3-46b7-bb39-af321868f9d8-kube-api-access-swrhg\") pod \"3bffbae6-69e3-46b7-bb39-af321868f9d8\" (UID: \"3bffbae6-69e3-46b7-bb39-af321868f9d8\") " Mar 08 05:42:57 crc kubenswrapper[4717]: I0308 05:42:57.494766 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-utilities" (OuterVolumeSpecName: "utilities") pod "3bffbae6-69e3-46b7-bb39-af321868f9d8" (UID: "3bffbae6-69e3-46b7-bb39-af321868f9d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:42:57 crc kubenswrapper[4717]: I0308 05:42:57.501332 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bffbae6-69e3-46b7-bb39-af321868f9d8-kube-api-access-swrhg" (OuterVolumeSpecName: "kube-api-access-swrhg") pod "3bffbae6-69e3-46b7-bb39-af321868f9d8" (UID: "3bffbae6-69e3-46b7-bb39-af321868f9d8"). InnerVolumeSpecName "kube-api-access-swrhg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:42:57 crc kubenswrapper[4717]: I0308 05:42:57.521186 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bffbae6-69e3-46b7-bb39-af321868f9d8" (UID: "3bffbae6-69e3-46b7-bb39-af321868f9d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:42:57 crc kubenswrapper[4717]: I0308 05:42:57.595237 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:42:57 crc kubenswrapper[4717]: I0308 05:42:57.595297 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bffbae6-69e3-46b7-bb39-af321868f9d8-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:42:57 crc kubenswrapper[4717]: I0308 05:42:57.595318 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swrhg\" (UniqueName: \"kubernetes.io/projected/3bffbae6-69e3-46b7-bb39-af321868f9d8-kube-api-access-swrhg\") on node \"crc\" DevicePath \"\"" Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.105499 4717 generic.go:334] "Generic (PLEG): container finished" podID="804f0686-e4ef-4cd6-bbe2-a2e7788759e2" containerID="26b8f022b2edb20558512793dd2d378274b56f049084ad96ee0c7a9a0398c9d0" exitCode=0 Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.106095 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wqp4w" event={"ID":"804f0686-e4ef-4cd6-bbe2-a2e7788759e2","Type":"ContainerDied","Data":"26b8f022b2edb20558512793dd2d378274b56f049084ad96ee0c7a9a0398c9d0"} Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.110109 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-m6v47" event={"ID":"3bffbae6-69e3-46b7-bb39-af321868f9d8","Type":"ContainerDied","Data":"806a8629a212b7103563956afd77f4af172e6cb0fc3cfc5a093bcd4ef0dfa8bc"} Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.110167 4717 scope.go:117] "RemoveContainer" containerID="daf49b0669e331b59755b6f395776b8cb33cfb65596ef849d8cc1df37416246a" Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.110372 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6v47" Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.113721 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg" event={"ID":"88b4e6f2-24f0-4e67-ab40-e3621ab5b44f","Type":"ContainerStarted","Data":"ab0b316820b44ac347cbf84914f136b86c5ddb4086e7690ec1a1734f5f0cbb78"} Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.114036 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg" Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.143365 4717 scope.go:117] "RemoveContainer" containerID="3bfcb1af7dc5a7e5f6d9a27621d5af28548b0a6c6ee77313ad02b6d363137600" Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.175623 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg" podStartSLOduration=2.008816672 podStartE2EDuration="10.175583921s" podCreationTimestamp="2026-03-08 05:42:48 +0000 UTC" firstStartedPulling="2026-03-08 05:42:49.06190333 +0000 UTC m=+995.979552174" lastFinishedPulling="2026-03-08 05:42:57.228670539 +0000 UTC m=+1004.146319423" observedRunningTime="2026-03-08 05:42:58.16786918 +0000 UTC m=+1005.085518074" watchObservedRunningTime="2026-03-08 05:42:58.175583921 +0000 UTC m=+1005.093232815" Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.190247 4717 
scope.go:117] "RemoveContainer" containerID="450c6251ec5f401560536e3572697a699e88a427ca7f03bc5b1b9b4f190c54e6" Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.200436 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6v47"] Mar 08 05:42:58 crc kubenswrapper[4717]: I0308 05:42:58.214834 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m6v47"] Mar 08 05:42:59 crc kubenswrapper[4717]: I0308 05:42:59.136329 4717 generic.go:334] "Generic (PLEG): container finished" podID="804f0686-e4ef-4cd6-bbe2-a2e7788759e2" containerID="942f9ebe933feda8149d4dd006acab2e3294214b9df7f30b7bd43265a08a2fd1" exitCode=0 Mar 08 05:42:59 crc kubenswrapper[4717]: I0308 05:42:59.136539 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wqp4w" event={"ID":"804f0686-e4ef-4cd6-bbe2-a2e7788759e2","Type":"ContainerDied","Data":"942f9ebe933feda8149d4dd006acab2e3294214b9df7f30b7bd43265a08a2fd1"} Mar 08 05:42:59 crc kubenswrapper[4717]: I0308 05:42:59.789370 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bffbae6-69e3-46b7-bb39-af321868f9d8" path="/var/lib/kubelet/pods/3bffbae6-69e3-46b7-bb39-af321868f9d8/volumes" Mar 08 05:43:00 crc kubenswrapper[4717]: I0308 05:43:00.152328 4717 generic.go:334] "Generic (PLEG): container finished" podID="804f0686-e4ef-4cd6-bbe2-a2e7788759e2" containerID="948c1f51dac5e547f1369af9058d5fd1f757f83a42019167842e8b9e04da4511" exitCode=0 Mar 08 05:43:00 crc kubenswrapper[4717]: I0308 05:43:00.152395 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wqp4w" event={"ID":"804f0686-e4ef-4cd6-bbe2-a2e7788759e2","Type":"ContainerDied","Data":"948c1f51dac5e547f1369af9058d5fd1f757f83a42019167842e8b9e04da4511"} Mar 08 05:43:00 crc kubenswrapper[4717]: I0308 05:43:00.370047 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-27q99" Mar 
08 05:43:01 crc kubenswrapper[4717]: I0308 05:43:01.172480 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wqp4w" event={"ID":"804f0686-e4ef-4cd6-bbe2-a2e7788759e2","Type":"ContainerStarted","Data":"89d38c820dae0aa444e200de3c832d2de20a7a5e69a150ce6593a2702e565a5f"} Mar 08 05:43:01 crc kubenswrapper[4717]: I0308 05:43:01.173200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wqp4w" event={"ID":"804f0686-e4ef-4cd6-bbe2-a2e7788759e2","Type":"ContainerStarted","Data":"68f9184fce8e102e5c3159d714bc58e8416174c76cbbdfa5c71347276722aaa3"} Mar 08 05:43:01 crc kubenswrapper[4717]: I0308 05:43:01.173262 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wqp4w" event={"ID":"804f0686-e4ef-4cd6-bbe2-a2e7788759e2","Type":"ContainerStarted","Data":"113cbd57d9b8f26b24ce9ffead6567a386d9576fe7b1f2389a74178e70d34f3f"} Mar 08 05:43:01 crc kubenswrapper[4717]: I0308 05:43:01.173285 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wqp4w" event={"ID":"804f0686-e4ef-4cd6-bbe2-a2e7788759e2","Type":"ContainerStarted","Data":"5aaf2f2dc34668c3b117ecd5d6180ae028a1aba1e225dd124de3206b3570b724"} Mar 08 05:43:02 crc kubenswrapper[4717]: I0308 05:43:02.190898 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wqp4w" event={"ID":"804f0686-e4ef-4cd6-bbe2-a2e7788759e2","Type":"ContainerStarted","Data":"c34ecf415f4c9e77926a430198ae633b1530f4b3c9f85a1a0fabd0c23513ff3b"} Mar 08 05:43:02 crc kubenswrapper[4717]: I0308 05:43:02.191305 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wqp4w" event={"ID":"804f0686-e4ef-4cd6-bbe2-a2e7788759e2","Type":"ContainerStarted","Data":"ac9df14fa977a609e85f9c893a1f22a27581a97587abd30fc7eb0dbd3ca1499e"} Mar 08 05:43:02 crc kubenswrapper[4717]: I0308 05:43:02.191329 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-wqp4w" Mar 08 
05:43:02 crc kubenswrapper[4717]: I0308 05:43:02.235028 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-wqp4w" podStartSLOduration=5.932504068 podStartE2EDuration="14.234997288s" podCreationTimestamp="2026-03-08 05:42:48 +0000 UTC" firstStartedPulling="2026-03-08 05:42:48.941438185 +0000 UTC m=+995.859087029" lastFinishedPulling="2026-03-08 05:42:57.243931405 +0000 UTC m=+1004.161580249" observedRunningTime="2026-03-08 05:43:02.223505365 +0000 UTC m=+1009.141154249" watchObservedRunningTime="2026-03-08 05:43:02.234997288 +0000 UTC m=+1009.152646172" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.358992 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rxv9t"] Mar 08 05:43:03 crc kubenswrapper[4717]: E0308 05:43:03.359336 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerName="extract-utilities" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.359353 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerName="extract-utilities" Mar 08 05:43:03 crc kubenswrapper[4717]: E0308 05:43:03.359372 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerName="registry-server" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.359380 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerName="registry-server" Mar 08 05:43:03 crc kubenswrapper[4717]: E0308 05:43:03.359413 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerName="extract-content" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.359422 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerName="extract-content" Mar 08 05:43:03 crc 
kubenswrapper[4717]: I0308 05:43:03.359574 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bffbae6-69e3-46b7-bb39-af321868f9d8" containerName="registry-server" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.360188 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rxv9t" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.366017 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.373137 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rxv9t"] Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.377951 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.511319 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgfv\" (UniqueName: \"kubernetes.io/projected/9ac2c000-b662-4546-8867-285019d35088-kube-api-access-psgfv\") pod \"openstack-operator-index-rxv9t\" (UID: \"9ac2c000-b662-4546-8867-285019d35088\") " pod="openstack-operators/openstack-operator-index-rxv9t" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.613483 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgfv\" (UniqueName: \"kubernetes.io/projected/9ac2c000-b662-4546-8867-285019d35088-kube-api-access-psgfv\") pod \"openstack-operator-index-rxv9t\" (UID: \"9ac2c000-b662-4546-8867-285019d35088\") " pod="openstack-operators/openstack-operator-index-rxv9t" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.636840 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgfv\" (UniqueName: 
\"kubernetes.io/projected/9ac2c000-b662-4546-8867-285019d35088-kube-api-access-psgfv\") pod \"openstack-operator-index-rxv9t\" (UID: \"9ac2c000-b662-4546-8867-285019d35088\") " pod="openstack-operators/openstack-operator-index-rxv9t" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.724459 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rxv9t" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.757450 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:43:03 crc kubenswrapper[4717]: I0308 05:43:03.825542 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:43:04 crc kubenswrapper[4717]: I0308 05:43:04.120442 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:43:04 crc kubenswrapper[4717]: I0308 05:43:04.120585 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:43:04 crc kubenswrapper[4717]: I0308 05:43:04.274794 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rxv9t"] Mar 08 05:43:05 crc kubenswrapper[4717]: I0308 05:43:05.224898 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rxv9t" 
event={"ID":"9ac2c000-b662-4546-8867-285019d35088","Type":"ContainerStarted","Data":"3696caa6bbee67a9494526d58e9db6e45b4c9288e4e0e0f499c96bf4bae1571b"} Mar 08 05:43:05 crc kubenswrapper[4717]: I0308 05:43:05.699978 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rxv9t"] Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.103413 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bdxpf"] Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.104458 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bdxpf" Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.110053 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-52zs5" Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.121253 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bdxpf"] Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.173180 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f7vj\" (UniqueName: \"kubernetes.io/projected/33dbebe0-8cce-49d5-afc5-287c2c188438-kube-api-access-6f7vj\") pod \"openstack-operator-index-bdxpf\" (UID: \"33dbebe0-8cce-49d5-afc5-287c2c188438\") " pod="openstack-operators/openstack-operator-index-bdxpf" Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.238246 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rxv9t" event={"ID":"9ac2c000-b662-4546-8867-285019d35088","Type":"ContainerStarted","Data":"83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba"} Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.238439 4717 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-rxv9t" podUID="9ac2c000-b662-4546-8867-285019d35088" containerName="registry-server" containerID="cri-o://83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba" gracePeriod=2 Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.265574 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rxv9t" podStartSLOduration=1.82005379 podStartE2EDuration="3.265528963s" podCreationTimestamp="2026-03-08 05:43:03 +0000 UTC" firstStartedPulling="2026-03-08 05:43:04.275498023 +0000 UTC m=+1011.193146877" lastFinishedPulling="2026-03-08 05:43:05.720973166 +0000 UTC m=+1012.638622050" observedRunningTime="2026-03-08 05:43:06.260867728 +0000 UTC m=+1013.178516612" watchObservedRunningTime="2026-03-08 05:43:06.265528963 +0000 UTC m=+1013.183177847" Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.274291 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f7vj\" (UniqueName: \"kubernetes.io/projected/33dbebe0-8cce-49d5-afc5-287c2c188438-kube-api-access-6f7vj\") pod \"openstack-operator-index-bdxpf\" (UID: \"33dbebe0-8cce-49d5-afc5-287c2c188438\") " pod="openstack-operators/openstack-operator-index-bdxpf" Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.302986 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f7vj\" (UniqueName: \"kubernetes.io/projected/33dbebe0-8cce-49d5-afc5-287c2c188438-kube-api-access-6f7vj\") pod \"openstack-operator-index-bdxpf\" (UID: \"33dbebe0-8cce-49d5-afc5-287c2c188438\") " pod="openstack-operators/openstack-operator-index-bdxpf" Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.461029 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-bdxpf" Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.697583 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rxv9t" Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.749476 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bdxpf"] Mar 08 05:43:06 crc kubenswrapper[4717]: W0308 05:43:06.751882 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33dbebe0_8cce_49d5_afc5_287c2c188438.slice/crio-e42634aa64db62a628ceb9cd8582ddd60cb516ea7cceda176edc57a963ac5594 WatchSource:0}: Error finding container e42634aa64db62a628ceb9cd8582ddd60cb516ea7cceda176edc57a963ac5594: Status 404 returned error can't find the container with id e42634aa64db62a628ceb9cd8582ddd60cb516ea7cceda176edc57a963ac5594 Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.783030 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psgfv\" (UniqueName: \"kubernetes.io/projected/9ac2c000-b662-4546-8867-285019d35088-kube-api-access-psgfv\") pod \"9ac2c000-b662-4546-8867-285019d35088\" (UID: \"9ac2c000-b662-4546-8867-285019d35088\") " Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.791932 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac2c000-b662-4546-8867-285019d35088-kube-api-access-psgfv" (OuterVolumeSpecName: "kube-api-access-psgfv") pod "9ac2c000-b662-4546-8867-285019d35088" (UID: "9ac2c000-b662-4546-8867-285019d35088"). InnerVolumeSpecName "kube-api-access-psgfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:43:06 crc kubenswrapper[4717]: I0308 05:43:06.886284 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psgfv\" (UniqueName: \"kubernetes.io/projected/9ac2c000-b662-4546-8867-285019d35088-kube-api-access-psgfv\") on node \"crc\" DevicePath \"\"" Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.250858 4717 generic.go:334] "Generic (PLEG): container finished" podID="9ac2c000-b662-4546-8867-285019d35088" containerID="83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba" exitCode=0 Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.250949 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rxv9t" event={"ID":"9ac2c000-b662-4546-8867-285019d35088","Type":"ContainerDied","Data":"83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba"} Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.250963 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rxv9t" Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.251004 4717 scope.go:117] "RemoveContainer" containerID="83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba" Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.250987 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rxv9t" event={"ID":"9ac2c000-b662-4546-8867-285019d35088","Type":"ContainerDied","Data":"3696caa6bbee67a9494526d58e9db6e45b4c9288e4e0e0f499c96bf4bae1571b"} Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.254444 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bdxpf" event={"ID":"33dbebe0-8cce-49d5-afc5-287c2c188438","Type":"ContainerStarted","Data":"e42634aa64db62a628ceb9cd8582ddd60cb516ea7cceda176edc57a963ac5594"} Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.298280 4717 scope.go:117] "RemoveContainer" containerID="83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba" Mar 08 05:43:07 crc kubenswrapper[4717]: E0308 05:43:07.300333 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba\": container with ID starting with 83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba not found: ID does not exist" containerID="83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba" Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.300413 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba"} err="failed to get container status \"83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba\": rpc error: code = NotFound desc = could not find container 
\"83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba\": container with ID starting with 83671f6b5ed440cb73622130947efd2e6b6774ed7c4702ce20890728dcd968ba not found: ID does not exist" Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.306074 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rxv9t"] Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.310824 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rxv9t"] Mar 08 05:43:07 crc kubenswrapper[4717]: I0308 05:43:07.795993 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac2c000-b662-4546-8867-285019d35088" path="/var/lib/kubelet/pods/9ac2c000-b662-4546-8867-285019d35088/volumes" Mar 08 05:43:08 crc kubenswrapper[4717]: I0308 05:43:08.266327 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bdxpf" event={"ID":"33dbebe0-8cce-49d5-afc5-287c2c188438","Type":"ContainerStarted","Data":"656b01e3ca36fb0984baeb73658d92b57f93b94f70b63e63634d0c63b82ca376"} Mar 08 05:43:08 crc kubenswrapper[4717]: I0308 05:43:08.287705 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bdxpf" podStartSLOduration=1.816184882 podStartE2EDuration="2.287661804s" podCreationTimestamp="2026-03-08 05:43:06 +0000 UTC" firstStartedPulling="2026-03-08 05:43:06.756110907 +0000 UTC m=+1013.673759741" lastFinishedPulling="2026-03-08 05:43:07.227587779 +0000 UTC m=+1014.145236663" observedRunningTime="2026-03-08 05:43:08.286447914 +0000 UTC m=+1015.204096788" watchObservedRunningTime="2026-03-08 05:43:08.287661804 +0000 UTC m=+1015.205310648" Mar 08 05:43:08 crc kubenswrapper[4717]: I0308 05:43:08.772792 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vwtjg" Mar 08 05:43:08 crc 
kubenswrapper[4717]: I0308 05:43:08.894947 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-96ksw" Mar 08 05:43:16 crc kubenswrapper[4717]: I0308 05:43:16.462371 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-bdxpf" Mar 08 05:43:16 crc kubenswrapper[4717]: I0308 05:43:16.463486 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-bdxpf" Mar 08 05:43:16 crc kubenswrapper[4717]: I0308 05:43:16.502837 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-bdxpf" Mar 08 05:43:17 crc kubenswrapper[4717]: I0308 05:43:17.412488 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-bdxpf" Mar 08 05:43:18 crc kubenswrapper[4717]: I0308 05:43:18.762791 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-wqp4w" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.774639 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8"] Mar 08 05:43:29 crc kubenswrapper[4717]: E0308 05:43:29.776045 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac2c000-b662-4546-8867-285019d35088" containerName="registry-server" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.776072 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac2c000-b662-4546-8867-285019d35088" containerName="registry-server" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.776334 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac2c000-b662-4546-8867-285019d35088" containerName="registry-server" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.779278 4717 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.786444 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-j624s" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.809588 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8"] Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.838809 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s6sg\" (UniqueName: \"kubernetes.io/projected/c6e8615c-2151-4a0a-93e3-0638f91ab76c-kube-api-access-9s6sg\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.839080 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.839212 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:29 crc 
kubenswrapper[4717]: I0308 05:43:29.940561 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.940640 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.940784 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s6sg\" (UniqueName: \"kubernetes.io/projected/c6e8615c-2151-4a0a-93e3-0638f91ab76c-kube-api-access-9s6sg\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.941646 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.941775 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:29 crc kubenswrapper[4717]: I0308 05:43:29.980934 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s6sg\" (UniqueName: \"kubernetes.io/projected/c6e8615c-2151-4a0a-93e3-0638f91ab76c-kube-api-access-9s6sg\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:30 crc kubenswrapper[4717]: I0308 05:43:30.116524 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:30 crc kubenswrapper[4717]: I0308 05:43:30.635785 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8"] Mar 08 05:43:31 crc kubenswrapper[4717]: I0308 05:43:31.523224 4717 generic.go:334] "Generic (PLEG): container finished" podID="c6e8615c-2151-4a0a-93e3-0638f91ab76c" containerID="cc9c2c98473993419e406d0bd70440d7706608a429a064ab0e0f89f629422758" exitCode=0 Mar 08 05:43:31 crc kubenswrapper[4717]: I0308 05:43:31.523381 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" event={"ID":"c6e8615c-2151-4a0a-93e3-0638f91ab76c","Type":"ContainerDied","Data":"cc9c2c98473993419e406d0bd70440d7706608a429a064ab0e0f89f629422758"} Mar 08 05:43:31 crc kubenswrapper[4717]: I0308 05:43:31.523827 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" event={"ID":"c6e8615c-2151-4a0a-93e3-0638f91ab76c","Type":"ContainerStarted","Data":"1b7f1a7f4318c2f0f56d2f3750ba3a2dd16abcc213d628a42e997eb685ec2000"} Mar 08 05:43:32 crc kubenswrapper[4717]: I0308 05:43:32.539249 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" event={"ID":"c6e8615c-2151-4a0a-93e3-0638f91ab76c","Type":"ContainerStarted","Data":"104e34651130bca82f531faad6bf16293442d76ce5d4a94f6448120011a32186"} Mar 08 05:43:33 crc kubenswrapper[4717]: I0308 05:43:33.553777 4717 generic.go:334] "Generic (PLEG): container finished" podID="c6e8615c-2151-4a0a-93e3-0638f91ab76c" containerID="104e34651130bca82f531faad6bf16293442d76ce5d4a94f6448120011a32186" exitCode=0 Mar 08 05:43:33 crc kubenswrapper[4717]: I0308 05:43:33.553848 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" event={"ID":"c6e8615c-2151-4a0a-93e3-0638f91ab76c","Type":"ContainerDied","Data":"104e34651130bca82f531faad6bf16293442d76ce5d4a94f6448120011a32186"} Mar 08 05:43:34 crc kubenswrapper[4717]: I0308 05:43:34.119919 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:43:34 crc kubenswrapper[4717]: I0308 05:43:34.120523 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:43:34 crc 
kubenswrapper[4717]: I0308 05:43:34.120609 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:43:34 crc kubenswrapper[4717]: I0308 05:43:34.121815 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4b2434c01f53ad405ba837cb47237c7e26c6fdc63e5e92c263085831d1dc0d5"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 05:43:34 crc kubenswrapper[4717]: I0308 05:43:34.121934 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://c4b2434c01f53ad405ba837cb47237c7e26c6fdc63e5e92c263085831d1dc0d5" gracePeriod=600 Mar 08 05:43:34 crc kubenswrapper[4717]: I0308 05:43:34.568281 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="c4b2434c01f53ad405ba837cb47237c7e26c6fdc63e5e92c263085831d1dc0d5" exitCode=0 Mar 08 05:43:34 crc kubenswrapper[4717]: I0308 05:43:34.568405 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"c4b2434c01f53ad405ba837cb47237c7e26c6fdc63e5e92c263085831d1dc0d5"} Mar 08 05:43:34 crc kubenswrapper[4717]: I0308 05:43:34.568549 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"14c69fb7e16b1586e83c7b94c4423a6de420e911261ae096ef8585ebcd99c77b"} Mar 08 05:43:34 crc kubenswrapper[4717]: I0308 
05:43:34.568602 4717 scope.go:117] "RemoveContainer" containerID="cf6a478def8e3551d842bb82fdd1ad06931612308caf68537b97f44e9f97c812" Mar 08 05:43:34 crc kubenswrapper[4717]: I0308 05:43:34.573541 4717 generic.go:334] "Generic (PLEG): container finished" podID="c6e8615c-2151-4a0a-93e3-0638f91ab76c" containerID="b8a1c3c8e198ffea5b166077183db93e216896cf99ec4b402fddf656a6995fe9" exitCode=0 Mar 08 05:43:34 crc kubenswrapper[4717]: I0308 05:43:34.573603 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" event={"ID":"c6e8615c-2151-4a0a-93e3-0638f91ab76c","Type":"ContainerDied","Data":"b8a1c3c8e198ffea5b166077183db93e216896cf99ec4b402fddf656a6995fe9"} Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.001091 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.086553 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-util\") pod \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.086731 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-bundle\") pod \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\" (UID: \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.086866 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s6sg\" (UniqueName: \"kubernetes.io/projected/c6e8615c-2151-4a0a-93e3-0638f91ab76c-kube-api-access-9s6sg\") pod \"c6e8615c-2151-4a0a-93e3-0638f91ab76c\" (UID: 
\"c6e8615c-2151-4a0a-93e3-0638f91ab76c\") " Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.088044 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-bundle" (OuterVolumeSpecName: "bundle") pod "c6e8615c-2151-4a0a-93e3-0638f91ab76c" (UID: "c6e8615c-2151-4a0a-93e3-0638f91ab76c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.102129 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e8615c-2151-4a0a-93e3-0638f91ab76c-kube-api-access-9s6sg" (OuterVolumeSpecName: "kube-api-access-9s6sg") pod "c6e8615c-2151-4a0a-93e3-0638f91ab76c" (UID: "c6e8615c-2151-4a0a-93e3-0638f91ab76c"). InnerVolumeSpecName "kube-api-access-9s6sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.188782 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.188832 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s6sg\" (UniqueName: \"kubernetes.io/projected/c6e8615c-2151-4a0a-93e3-0638f91ab76c-kube-api-access-9s6sg\") on node \"crc\" DevicePath \"\"" Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.193344 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-util" (OuterVolumeSpecName: "util") pod "c6e8615c-2151-4a0a-93e3-0638f91ab76c" (UID: "c6e8615c-2151-4a0a-93e3-0638f91ab76c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.290883 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6e8615c-2151-4a0a-93e3-0638f91ab76c-util\") on node \"crc\" DevicePath \"\"" Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.610212 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" event={"ID":"c6e8615c-2151-4a0a-93e3-0638f91ab76c","Type":"ContainerDied","Data":"1b7f1a7f4318c2f0f56d2f3750ba3a2dd16abcc213d628a42e997eb685ec2000"} Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.610925 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b7f1a7f4318c2f0f56d2f3750ba3a2dd16abcc213d628a42e997eb685ec2000" Mar 08 05:43:36 crc kubenswrapper[4717]: I0308 05:43:36.610350 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.035254 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb"] Mar 08 05:43:40 crc kubenswrapper[4717]: E0308 05:43:40.036293 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e8615c-2151-4a0a-93e3-0638f91ab76c" containerName="util" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.036310 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e8615c-2151-4a0a-93e3-0638f91ab76c" containerName="util" Mar 08 05:43:40 crc kubenswrapper[4717]: E0308 05:43:40.036325 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e8615c-2151-4a0a-93e3-0638f91ab76c" containerName="pull" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.036333 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c6e8615c-2151-4a0a-93e3-0638f91ab76c" containerName="pull" Mar 08 05:43:40 crc kubenswrapper[4717]: E0308 05:43:40.036344 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e8615c-2151-4a0a-93e3-0638f91ab76c" containerName="extract" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.036351 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e8615c-2151-4a0a-93e3-0638f91ab76c" containerName="extract" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.036465 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e8615c-2151-4a0a-93e3-0638f91ab76c" containerName="extract" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.036994 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.040580 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-m9c44" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.135033 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb"] Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.160900 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-987qc\" (UniqueName: \"kubernetes.io/projected/003c1f39-7ea2-4391-87f9-875cbdf6e1cc-kube-api-access-987qc\") pod \"openstack-operator-controller-init-6f44f7b99f-4xgmb\" (UID: \"003c1f39-7ea2-4391-87f9-875cbdf6e1cc\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.262810 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-987qc\" (UniqueName: 
\"kubernetes.io/projected/003c1f39-7ea2-4391-87f9-875cbdf6e1cc-kube-api-access-987qc\") pod \"openstack-operator-controller-init-6f44f7b99f-4xgmb\" (UID: \"003c1f39-7ea2-4391-87f9-875cbdf6e1cc\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.295335 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-987qc\" (UniqueName: \"kubernetes.io/projected/003c1f39-7ea2-4391-87f9-875cbdf6e1cc-kube-api-access-987qc\") pod \"openstack-operator-controller-init-6f44f7b99f-4xgmb\" (UID: \"003c1f39-7ea2-4391-87f9-875cbdf6e1cc\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.358854 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb" Mar 08 05:43:40 crc kubenswrapper[4717]: I0308 05:43:40.879783 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb"] Mar 08 05:43:41 crc kubenswrapper[4717]: I0308 05:43:41.649976 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb" event={"ID":"003c1f39-7ea2-4391-87f9-875cbdf6e1cc","Type":"ContainerStarted","Data":"54fe02abb2636e1949fe04d02eac6222fba6b0a48abdd85d32c351187e11bed9"} Mar 08 05:43:46 crc kubenswrapper[4717]: I0308 05:43:46.687085 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb" event={"ID":"003c1f39-7ea2-4391-87f9-875cbdf6e1cc","Type":"ContainerStarted","Data":"4ec998ded9e526e5d95b0c23412f04b7f4dd73a647057caf3c6789090bc29518"} Mar 08 05:43:46 crc kubenswrapper[4717]: I0308 05:43:46.687662 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb" Mar 08 05:43:46 crc kubenswrapper[4717]: I0308 05:43:46.734919 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb" podStartSLOduration=1.571569271 podStartE2EDuration="6.734884107s" podCreationTimestamp="2026-03-08 05:43:40 +0000 UTC" firstStartedPulling="2026-03-08 05:43:40.893589991 +0000 UTC m=+1047.811238835" lastFinishedPulling="2026-03-08 05:43:46.056904807 +0000 UTC m=+1052.974553671" observedRunningTime="2026-03-08 05:43:46.727365902 +0000 UTC m=+1053.645014746" watchObservedRunningTime="2026-03-08 05:43:46.734884107 +0000 UTC m=+1053.652532951" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.082226 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vrxvs"] Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.084287 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.097093 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrxvs"] Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.161629 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-utilities\") pod \"certified-operators-vrxvs\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.161839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-catalog-content\") pod \"certified-operators-vrxvs\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.161896 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfsz4\" (UniqueName: \"kubernetes.io/projected/a2011048-4930-42a0-815e-8b21542483b5-kube-api-access-nfsz4\") pod \"certified-operators-vrxvs\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.265503 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfsz4\" (UniqueName: \"kubernetes.io/projected/a2011048-4930-42a0-815e-8b21542483b5-kube-api-access-nfsz4\") pod \"certified-operators-vrxvs\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.265884 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-utilities\") pod \"certified-operators-vrxvs\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.266015 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-catalog-content\") pod \"certified-operators-vrxvs\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.266629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-catalog-content\") pod \"certified-operators-vrxvs\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.266894 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-utilities\") pod \"certified-operators-vrxvs\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.286792 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfsz4\" (UniqueName: \"kubernetes.io/projected/a2011048-4930-42a0-815e-8b21542483b5-kube-api-access-nfsz4\") pod \"certified-operators-vrxvs\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.410769 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:43:51 crc kubenswrapper[4717]: I0308 05:43:51.747469 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrxvs"] Mar 08 05:43:52 crc kubenswrapper[4717]: I0308 05:43:52.758235 4717 generic.go:334] "Generic (PLEG): container finished" podID="a2011048-4930-42a0-815e-8b21542483b5" containerID="572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009" exitCode=0 Mar 08 05:43:52 crc kubenswrapper[4717]: I0308 05:43:52.758382 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrxvs" event={"ID":"a2011048-4930-42a0-815e-8b21542483b5","Type":"ContainerDied","Data":"572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009"} Mar 08 05:43:52 crc kubenswrapper[4717]: I0308 05:43:52.758753 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrxvs" event={"ID":"a2011048-4930-42a0-815e-8b21542483b5","Type":"ContainerStarted","Data":"4498ae0a1251f1fbb84b53a1c84e76c81727eb90575b6c289a17b1478a80ca0d"} Mar 08 05:43:53 crc kubenswrapper[4717]: I0308 05:43:53.769452 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrxvs" event={"ID":"a2011048-4930-42a0-815e-8b21542483b5","Type":"ContainerStarted","Data":"492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82"} Mar 08 05:43:54 crc kubenswrapper[4717]: I0308 05:43:54.787094 4717 generic.go:334] "Generic (PLEG): container finished" podID="a2011048-4930-42a0-815e-8b21542483b5" containerID="492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82" exitCode=0 Mar 08 05:43:54 crc kubenswrapper[4717]: I0308 05:43:54.787347 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrxvs" 
event={"ID":"a2011048-4930-42a0-815e-8b21542483b5","Type":"ContainerDied","Data":"492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82"} Mar 08 05:43:55 crc kubenswrapper[4717]: I0308 05:43:55.806770 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrxvs" event={"ID":"a2011048-4930-42a0-815e-8b21542483b5","Type":"ContainerStarted","Data":"25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e"} Mar 08 05:43:55 crc kubenswrapper[4717]: I0308 05:43:55.835994 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vrxvs" podStartSLOduration=2.371556095 podStartE2EDuration="4.835954007s" podCreationTimestamp="2026-03-08 05:43:51 +0000 UTC" firstStartedPulling="2026-03-08 05:43:52.760963108 +0000 UTC m=+1059.678611982" lastFinishedPulling="2026-03-08 05:43:55.22536102 +0000 UTC m=+1062.143009894" observedRunningTime="2026-03-08 05:43:55.835444915 +0000 UTC m=+1062.753093799" watchObservedRunningTime="2026-03-08 05:43:55.835954007 +0000 UTC m=+1062.753602861" Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.138719 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549144-cfx7q"] Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.141358 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549144-cfx7q" Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.145733 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.150591 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.150645 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.154925 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549144-cfx7q"] Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.223113 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbmzf\" (UniqueName: \"kubernetes.io/projected/008148d1-0dc4-4b2d-a69c-be8ee1b204c0-kube-api-access-lbmzf\") pod \"auto-csr-approver-29549144-cfx7q\" (UID: \"008148d1-0dc4-4b2d-a69c-be8ee1b204c0\") " pod="openshift-infra/auto-csr-approver-29549144-cfx7q" Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.325278 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbmzf\" (UniqueName: \"kubernetes.io/projected/008148d1-0dc4-4b2d-a69c-be8ee1b204c0-kube-api-access-lbmzf\") pod \"auto-csr-approver-29549144-cfx7q\" (UID: \"008148d1-0dc4-4b2d-a69c-be8ee1b204c0\") " pod="openshift-infra/auto-csr-approver-29549144-cfx7q" Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.363776 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbmzf\" (UniqueName: \"kubernetes.io/projected/008148d1-0dc4-4b2d-a69c-be8ee1b204c0-kube-api-access-lbmzf\") pod \"auto-csr-approver-29549144-cfx7q\" (UID: \"008148d1-0dc4-4b2d-a69c-be8ee1b204c0\") " 
pod="openshift-infra/auto-csr-approver-29549144-cfx7q" Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.367397 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-4xgmb" Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.465536 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549144-cfx7q" Mar 08 05:44:00 crc kubenswrapper[4717]: I0308 05:44:00.952472 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549144-cfx7q"] Mar 08 05:44:00 crc kubenswrapper[4717]: W0308 05:44:00.969816 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod008148d1_0dc4_4b2d_a69c_be8ee1b204c0.slice/crio-e505a53777750349b634d31c74e8664eda2f2b4b1974cf1a56b1ea6ce98d6b63 WatchSource:0}: Error finding container e505a53777750349b634d31c74e8664eda2f2b4b1974cf1a56b1ea6ce98d6b63: Status 404 returned error can't find the container with id e505a53777750349b634d31c74e8664eda2f2b4b1974cf1a56b1ea6ce98d6b63 Mar 08 05:44:01 crc kubenswrapper[4717]: I0308 05:44:01.411250 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:44:01 crc kubenswrapper[4717]: I0308 05:44:01.412479 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:44:01 crc kubenswrapper[4717]: I0308 05:44:01.476291 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:44:01 crc kubenswrapper[4717]: I0308 05:44:01.863876 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549144-cfx7q" 
event={"ID":"008148d1-0dc4-4b2d-a69c-be8ee1b204c0","Type":"ContainerStarted","Data":"e505a53777750349b634d31c74e8664eda2f2b4b1974cf1a56b1ea6ce98d6b63"} Mar 08 05:44:01 crc kubenswrapper[4717]: I0308 05:44:01.921762 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:44:01 crc kubenswrapper[4717]: I0308 05:44:01.983806 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrxvs"] Mar 08 05:44:02 crc kubenswrapper[4717]: I0308 05:44:02.878363 4717 generic.go:334] "Generic (PLEG): container finished" podID="008148d1-0dc4-4b2d-a69c-be8ee1b204c0" containerID="cf3fe648c4bcc07f3b254ce82882c1fbe0e5a130e3f05c082e588fda45b0dcfb" exitCode=0 Mar 08 05:44:02 crc kubenswrapper[4717]: I0308 05:44:02.878478 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549144-cfx7q" event={"ID":"008148d1-0dc4-4b2d-a69c-be8ee1b204c0","Type":"ContainerDied","Data":"cf3fe648c4bcc07f3b254ce82882c1fbe0e5a130e3f05c082e588fda45b0dcfb"} Mar 08 05:44:03 crc kubenswrapper[4717]: I0308 05:44:03.890801 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vrxvs" podUID="a2011048-4930-42a0-815e-8b21542483b5" containerName="registry-server" containerID="cri-o://25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e" gracePeriod=2 Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.269675 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549144-cfx7q" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.293990 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbmzf\" (UniqueName: \"kubernetes.io/projected/008148d1-0dc4-4b2d-a69c-be8ee1b204c0-kube-api-access-lbmzf\") pod \"008148d1-0dc4-4b2d-a69c-be8ee1b204c0\" (UID: \"008148d1-0dc4-4b2d-a69c-be8ee1b204c0\") " Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.310033 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008148d1-0dc4-4b2d-a69c-be8ee1b204c0-kube-api-access-lbmzf" (OuterVolumeSpecName: "kube-api-access-lbmzf") pod "008148d1-0dc4-4b2d-a69c-be8ee1b204c0" (UID: "008148d1-0dc4-4b2d-a69c-be8ee1b204c0"). InnerVolumeSpecName "kube-api-access-lbmzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.363633 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.396702 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-catalog-content\") pod \"a2011048-4930-42a0-815e-8b21542483b5\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.396858 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfsz4\" (UniqueName: \"kubernetes.io/projected/a2011048-4930-42a0-815e-8b21542483b5-kube-api-access-nfsz4\") pod \"a2011048-4930-42a0-815e-8b21542483b5\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.396909 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-utilities\") pod \"a2011048-4930-42a0-815e-8b21542483b5\" (UID: \"a2011048-4930-42a0-815e-8b21542483b5\") " Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.397371 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbmzf\" (UniqueName: \"kubernetes.io/projected/008148d1-0dc4-4b2d-a69c-be8ee1b204c0-kube-api-access-lbmzf\") on node \"crc\" DevicePath \"\"" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.398539 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-utilities" (OuterVolumeSpecName: "utilities") pod "a2011048-4930-42a0-815e-8b21542483b5" (UID: "a2011048-4930-42a0-815e-8b21542483b5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.402649 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2011048-4930-42a0-815e-8b21542483b5-kube-api-access-nfsz4" (OuterVolumeSpecName: "kube-api-access-nfsz4") pod "a2011048-4930-42a0-815e-8b21542483b5" (UID: "a2011048-4930-42a0-815e-8b21542483b5"). InnerVolumeSpecName "kube-api-access-nfsz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.498368 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.498409 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfsz4\" (UniqueName: \"kubernetes.io/projected/a2011048-4930-42a0-815e-8b21542483b5-kube-api-access-nfsz4\") on node \"crc\" DevicePath \"\"" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.906213 4717 generic.go:334] "Generic (PLEG): container finished" podID="a2011048-4930-42a0-815e-8b21542483b5" containerID="25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e" exitCode=0 Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.906339 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrxvs" event={"ID":"a2011048-4930-42a0-815e-8b21542483b5","Type":"ContainerDied","Data":"25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e"} Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.906427 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrxvs" event={"ID":"a2011048-4930-42a0-815e-8b21542483b5","Type":"ContainerDied","Data":"4498ae0a1251f1fbb84b53a1c84e76c81727eb90575b6c289a17b1478a80ca0d"} Mar 08 05:44:04 crc kubenswrapper[4717]: 
I0308 05:44:04.906459 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrxvs" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.906465 4717 scope.go:117] "RemoveContainer" containerID="25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.909233 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549144-cfx7q" event={"ID":"008148d1-0dc4-4b2d-a69c-be8ee1b204c0","Type":"ContainerDied","Data":"e505a53777750349b634d31c74e8664eda2f2b4b1974cf1a56b1ea6ce98d6b63"} Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.909324 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e505a53777750349b634d31c74e8664eda2f2b4b1974cf1a56b1ea6ce98d6b63" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.909348 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549144-cfx7q" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.941298 4717 scope.go:117] "RemoveContainer" containerID="492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82" Mar 08 05:44:04 crc kubenswrapper[4717]: I0308 05:44:04.979130 4717 scope.go:117] "RemoveContainer" containerID="572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009" Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.004805 4717 scope.go:117] "RemoveContainer" containerID="25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e" Mar 08 05:44:05 crc kubenswrapper[4717]: E0308 05:44:05.005918 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e\": container with ID starting with 25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e not found: ID does not exist" 
containerID="25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e" Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.005999 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e"} err="failed to get container status \"25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e\": rpc error: code = NotFound desc = could not find container \"25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e\": container with ID starting with 25d5298023b294e0b7ec04178175ebafec899e72746cd43ee8e2287fcc77020e not found: ID does not exist" Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.006054 4717 scope.go:117] "RemoveContainer" containerID="492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82" Mar 08 05:44:05 crc kubenswrapper[4717]: E0308 05:44:05.007460 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82\": container with ID starting with 492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82 not found: ID does not exist" containerID="492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82" Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.007526 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82"} err="failed to get container status \"492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82\": rpc error: code = NotFound desc = could not find container \"492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82\": container with ID starting with 492be65fb27f86416ee408f163eae4dcaed6efe3c50a1cc370485ad52c9efc82 not found: ID does not exist" Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.007572 4717 scope.go:117] 
"RemoveContainer" containerID="572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009" Mar 08 05:44:05 crc kubenswrapper[4717]: E0308 05:44:05.008219 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009\": container with ID starting with 572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009 not found: ID does not exist" containerID="572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009" Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.008270 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009"} err="failed to get container status \"572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009\": rpc error: code = NotFound desc = could not find container \"572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009\": container with ID starting with 572f159af886676220a4d9ff7196ed3c4347a8a36616f77569c162ca6406f009 not found: ID does not exist" Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.075315 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2011048-4930-42a0-815e-8b21542483b5" (UID: "a2011048-4930-42a0-815e-8b21542483b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.111892 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2011048-4930-42a0-815e-8b21542483b5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.245331 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrxvs"]
Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.254574 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vrxvs"]
Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.328468 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549138-sbrdb"]
Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.336940 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549138-sbrdb"]
Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.792038 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15c2d4c4-71ce-4639-9fa9-5147173cddfb" path="/var/lib/kubelet/pods/15c2d4c4-71ce-4639-9fa9-5147173cddfb/volumes"
Mar 08 05:44:05 crc kubenswrapper[4717]: I0308 05:44:05.793545 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2011048-4930-42a0-815e-8b21542483b5" path="/var/lib/kubelet/pods/a2011048-4930-42a0-815e-8b21542483b5/volumes"
Mar 08 05:44:24 crc kubenswrapper[4717]: I0308 05:44:24.408629 4717 scope.go:117] "RemoveContainer" containerID="429e00fba7c737713e1649c06b86f985dcd81c81c65621a18de75d7dbbb06b54"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.656238 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt"]
Mar 08 05:44:25 crc kubenswrapper[4717]: E0308 05:44:25.666612 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2011048-4930-42a0-815e-8b21542483b5" containerName="registry-server"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.666638 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2011048-4930-42a0-815e-8b21542483b5" containerName="registry-server"
Mar 08 05:44:25 crc kubenswrapper[4717]: E0308 05:44:25.666656 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2011048-4930-42a0-815e-8b21542483b5" containerName="extract-content"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.666663 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2011048-4930-42a0-815e-8b21542483b5" containerName="extract-content"
Mar 08 05:44:25 crc kubenswrapper[4717]: E0308 05:44:25.666716 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="008148d1-0dc4-4b2d-a69c-be8ee1b204c0" containerName="oc"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.666724 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="008148d1-0dc4-4b2d-a69c-be8ee1b204c0" containerName="oc"
Mar 08 05:44:25 crc kubenswrapper[4717]: E0308 05:44:25.666749 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2011048-4930-42a0-815e-8b21542483b5" containerName="extract-utilities"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.666755 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2011048-4930-42a0-815e-8b21542483b5" containerName="extract-utilities"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.666981 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2011048-4930-42a0-815e-8b21542483b5" containerName="registry-server"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.667011 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="008148d1-0dc4-4b2d-a69c-be8ee1b204c0" containerName="oc"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.668235 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.675402 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-tskl5"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.703143 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.704310 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.709707 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.712141 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.714602 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xk6wd"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.714819 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hmzb9"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.741625 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.775898 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22pxr\" (UniqueName: \"kubernetes.io/projected/8f3bb097-82e6-4fe8-ad89-48004c80477b-kube-api-access-22pxr\") pod \"cinder-operator-controller-manager-55d77d7b5c-x5kbt\" (UID: \"8f3bb097-82e6-4fe8-ad89-48004c80477b\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.810567 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.810627 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.810647 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.811784 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.823340 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-7dv4r"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.824161 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.825310 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.832785 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.839375 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rn788"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.839602 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.846789 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.848311 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.855411 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qxkbg"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.859759 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.860826 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.870460 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.871648 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.871741 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.872187 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-t6rdh"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.878938 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22pxr\" (UniqueName: \"kubernetes.io/projected/8f3bb097-82e6-4fe8-ad89-48004c80477b-kube-api-access-22pxr\") pod \"cinder-operator-controller-manager-55d77d7b5c-x5kbt\" (UID: \"8f3bb097-82e6-4fe8-ad89-48004c80477b\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.879004 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2md82\" (UniqueName: \"kubernetes.io/projected/8e10706f-2cf2-4b11-a084-33df5b7fe0a1-kube-api-access-2md82\") pod \"barbican-operator-controller-manager-6db6876945-7x79h\" (UID: \"8e10706f-2cf2-4b11-a084-33df5b7fe0a1\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.879034 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbrgm\" (UniqueName: \"kubernetes.io/projected/bf98a4b8-6e3c-423d-b228-347c527e6721-kube-api-access-pbrgm\") pod \"designate-operator-controller-manager-5d87c9d997-wdwbm\" (UID: \"bf98a4b8-6e3c-423d-b228-347c527e6721\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.879067 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vs8vd"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.879784 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.923713 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.936271 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.937438 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.955785 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.955856 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.958191 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-9p98v"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.958323 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj"]
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.959284 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.964800 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-jh2ss"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990368 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2rzw\" (UniqueName: \"kubernetes.io/projected/11010a39-3786-472f-ad04-805c35647afc-kube-api-access-r2rzw\") pod \"glance-operator-controller-manager-64db6967f8-vbdls\" (UID: \"11010a39-3786-472f-ad04-805c35647afc\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990407 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt6xv\" (UniqueName: \"kubernetes.io/projected/9c64b2a6-6663-46cc-b762-bffa01baeb47-kube-api-access-gt6xv\") pod \"ironic-operator-controller-manager-545456dc4-x44f7\" (UID: \"9c64b2a6-6663-46cc-b762-bffa01baeb47\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990430 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990452 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl7bt\" (UniqueName: \"kubernetes.io/projected/9fe70b75-885a-402b-98e1-f5c696e47f48-kube-api-access-nl7bt\") pod \"horizon-operator-controller-manager-78bc7f9bd9-s8gsj\" (UID: \"9fe70b75-885a-402b-98e1-f5c696e47f48\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990499 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2md82\" (UniqueName: \"kubernetes.io/projected/8e10706f-2cf2-4b11-a084-33df5b7fe0a1-kube-api-access-2md82\") pod \"barbican-operator-controller-manager-6db6876945-7x79h\" (UID: \"8e10706f-2cf2-4b11-a084-33df5b7fe0a1\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990540 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbr46\" (UniqueName: \"kubernetes.io/projected/7eda0f52-4fcf-46fe-b329-075fb4d79c74-kube-api-access-sbr46\") pod \"manila-operator-controller-manager-67d996989d-mp5dj\" (UID: \"7eda0f52-4fcf-46fe-b329-075fb4d79c74\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990564 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbrgm\" (UniqueName: \"kubernetes.io/projected/bf98a4b8-6e3c-423d-b228-347c527e6721-kube-api-access-pbrgm\") pod \"designate-operator-controller-manager-5d87c9d997-wdwbm\" (UID: \"bf98a4b8-6e3c-423d-b228-347c527e6721\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990600 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5dgt\" (UniqueName: \"kubernetes.io/projected/74fc8d21-150d-4009-b0ba-b6a47db5adbb-kube-api-access-k5dgt\") pod \"heat-operator-controller-manager-cf99c678f-vpqhf\" (UID: \"74fc8d21-150d-4009-b0ba-b6a47db5adbb\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990608 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22pxr\" (UniqueName: \"kubernetes.io/projected/8f3bb097-82e6-4fe8-ad89-48004c80477b-kube-api-access-22pxr\") pod \"cinder-operator-controller-manager-55d77d7b5c-x5kbt\" (UID: \"8f3bb097-82e6-4fe8-ad89-48004c80477b\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990624 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmwp5\" (UniqueName: \"kubernetes.io/projected/be14026d-4e86-4134-8f2a-617e9272d2a1-kube-api-access-lmwp5\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx"
Mar 08 05:44:25 crc kubenswrapper[4717]: I0308 05:44:25.990772 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5hr5\" (UniqueName: \"kubernetes.io/projected/7668ece6-7b88-4707-baf2-62379071cf43-kube-api-access-q5hr5\") pod \"keystone-operator-controller-manager-7c789f89c6-wmjsb\" (UID: \"7668ece6-7b88-4707-baf2-62379071cf43\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.012998 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.013475 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.027546 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2md82\" (UniqueName: \"kubernetes.io/projected/8e10706f-2cf2-4b11-a084-33df5b7fe0a1-kube-api-access-2md82\") pod \"barbican-operator-controller-manager-6db6876945-7x79h\" (UID: \"8e10706f-2cf2-4b11-a084-33df5b7fe0a1\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.045100 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.051479 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.051945 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.052775 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbrgm\" (UniqueName: \"kubernetes.io/projected/bf98a4b8-6e3c-423d-b228-347c527e6721-kube-api-access-pbrgm\") pod \"designate-operator-controller-manager-5d87c9d997-wdwbm\" (UID: \"bf98a4b8-6e3c-423d-b228-347c527e6721\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.056719 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-zhdxr"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.070436 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.071052 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.072100 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.074795 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7chhp"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.086024 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.093778 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbr46\" (UniqueName: \"kubernetes.io/projected/7eda0f52-4fcf-46fe-b329-075fb4d79c74-kube-api-access-sbr46\") pod \"manila-operator-controller-manager-67d996989d-mp5dj\" (UID: \"7eda0f52-4fcf-46fe-b329-075fb4d79c74\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.093865 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5dgt\" (UniqueName: \"kubernetes.io/projected/74fc8d21-150d-4009-b0ba-b6a47db5adbb-kube-api-access-k5dgt\") pod \"heat-operator-controller-manager-cf99c678f-vpqhf\" (UID: \"74fc8d21-150d-4009-b0ba-b6a47db5adbb\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.093900 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmwp5\" (UniqueName: \"kubernetes.io/projected/be14026d-4e86-4134-8f2a-617e9272d2a1-kube-api-access-lmwp5\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.093928 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5hr5\" (UniqueName: \"kubernetes.io/projected/7668ece6-7b88-4707-baf2-62379071cf43-kube-api-access-q5hr5\") pod \"keystone-operator-controller-manager-7c789f89c6-wmjsb\" (UID: \"7668ece6-7b88-4707-baf2-62379071cf43\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.093967 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2rzw\" (UniqueName: \"kubernetes.io/projected/11010a39-3786-472f-ad04-805c35647afc-kube-api-access-r2rzw\") pod \"glance-operator-controller-manager-64db6967f8-vbdls\" (UID: \"11010a39-3786-472f-ad04-805c35647afc\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.093987 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt6xv\" (UniqueName: \"kubernetes.io/projected/9c64b2a6-6663-46cc-b762-bffa01baeb47-kube-api-access-gt6xv\") pod \"ironic-operator-controller-manager-545456dc4-x44f7\" (UID: \"9c64b2a6-6663-46cc-b762-bffa01baeb47\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.094006 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.094026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl7bt\" (UniqueName: \"kubernetes.io/projected/9fe70b75-885a-402b-98e1-f5c696e47f48-kube-api-access-nl7bt\") pod \"horizon-operator-controller-manager-78bc7f9bd9-s8gsj\" (UID: \"9fe70b75-885a-402b-98e1-f5c696e47f48\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.094061 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq9xn\" (UniqueName: \"kubernetes.io/projected/148c1a2c-7098-4111-a12e-02e2dcc295a6-kube-api-access-sq9xn\") pod \"mariadb-operator-controller-manager-7b6bfb6475-959nd\" (UID: \"148c1a2c-7098-4111-a12e-02e2dcc295a6\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.094087 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npd2b\" (UniqueName: \"kubernetes.io/projected/999e5f1a-4be7-4716-8999-e28027c618b9-kube-api-access-npd2b\") pod \"neutron-operator-controller-manager-54688575f-hl7k4\" (UID: \"999e5f1a-4be7-4716-8999-e28027c618b9\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4"
Mar 08 05:44:26 crc kubenswrapper[4717]: E0308 05:44:26.094640 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 08 05:44:26 crc kubenswrapper[4717]: E0308 05:44:26.094720 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert podName:be14026d-4e86-4134-8f2a-617e9272d2a1 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:26.594700386 +0000 UTC m=+1093.512349230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert") pod "infra-operator-controller-manager-f7fcc58b9-vrnlx" (UID: "be14026d-4e86-4134-8f2a-617e9272d2a1") : secret "infra-operator-webhook-server-cert" not found
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.110802 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.128801 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.130084 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.136611 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-42v2j"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.138837 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt6xv\" (UniqueName: \"kubernetes.io/projected/9c64b2a6-6663-46cc-b762-bffa01baeb47-kube-api-access-gt6xv\") pod \"ironic-operator-controller-manager-545456dc4-x44f7\" (UID: \"9c64b2a6-6663-46cc-b762-bffa01baeb47\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.153572 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5dgt\" (UniqueName: \"kubernetes.io/projected/74fc8d21-150d-4009-b0ba-b6a47db5adbb-kube-api-access-k5dgt\") pod \"heat-operator-controller-manager-cf99c678f-vpqhf\" (UID: \"74fc8d21-150d-4009-b0ba-b6a47db5adbb\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.157377 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmwp5\" (UniqueName: \"kubernetes.io/projected/be14026d-4e86-4134-8f2a-617e9272d2a1-kube-api-access-lmwp5\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.189773 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.190542 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2rzw\" (UniqueName: \"kubernetes.io/projected/11010a39-3786-472f-ad04-805c35647afc-kube-api-access-r2rzw\") pod \"glance-operator-controller-manager-64db6967f8-vbdls\" (UID: \"11010a39-3786-472f-ad04-805c35647afc\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.195009 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq9xn\" (UniqueName: \"kubernetes.io/projected/148c1a2c-7098-4111-a12e-02e2dcc295a6-kube-api-access-sq9xn\") pod \"mariadb-operator-controller-manager-7b6bfb6475-959nd\" (UID: \"148c1a2c-7098-4111-a12e-02e2dcc295a6\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.195055 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npd2b\" (UniqueName: \"kubernetes.io/projected/999e5f1a-4be7-4716-8999-e28027c618b9-kube-api-access-npd2b\") pod \"neutron-operator-controller-manager-54688575f-hl7k4\" (UID: \"999e5f1a-4be7-4716-8999-e28027c618b9\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.195112 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9k5h\" (UniqueName: \"kubernetes.io/projected/17485954-f1e6-4042-9338-ad5115801764-kube-api-access-q9k5h\") pod \"nova-operator-controller-manager-74b6b5dc96-htv95\" (UID: \"17485954-f1e6-4042-9338-ad5115801764\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.197890 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl7bt\" (UniqueName: \"kubernetes.io/projected/9fe70b75-885a-402b-98e1-f5c696e47f48-kube-api-access-nl7bt\") pod \"horizon-operator-controller-manager-78bc7f9bd9-s8gsj\" (UID: \"9fe70b75-885a-402b-98e1-f5c696e47f48\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.204998 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.246127 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.284668 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.298151 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbr46\" (UniqueName: \"kubernetes.io/projected/7eda0f52-4fcf-46fe-b329-075fb4d79c74-kube-api-access-sbr46\") pod \"manila-operator-controller-manager-67d996989d-mp5dj\" (UID: \"7eda0f52-4fcf-46fe-b329-075fb4d79c74\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.306668 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5hr5\" (UniqueName: \"kubernetes.io/projected/7668ece6-7b88-4707-baf2-62379071cf43-kube-api-access-q5hr5\") pod \"keystone-operator-controller-manager-7c789f89c6-wmjsb\" (UID: \"7668ece6-7b88-4707-baf2-62379071cf43\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.313424 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.315500 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-dgdzm"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.340495 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npd2b\" (UniqueName: \"kubernetes.io/projected/999e5f1a-4be7-4716-8999-e28027c618b9-kube-api-access-npd2b\") pod \"neutron-operator-controller-manager-54688575f-hl7k4\" (UID: \"999e5f1a-4be7-4716-8999-e28027c618b9\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.369380 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq9xn\" (UniqueName: \"kubernetes.io/projected/148c1a2c-7098-4111-a12e-02e2dcc295a6-kube-api-access-sq9xn\") pod \"mariadb-operator-controller-manager-7b6bfb6475-959nd\" (UID: \"148c1a2c-7098-4111-a12e-02e2dcc295a6\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.373373 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.396088 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.409674 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.440201 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9zw2\" (UniqueName: \"kubernetes.io/projected/44e5de82-d168-400e-801f-1f122a08c656-kube-api-access-c9zw2\") pod \"octavia-operator-controller-manager-5d86c7ddb7-cmn95\" (UID: \"44e5de82-d168-400e-801f-1f122a08c656\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.440284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9k5h\" (UniqueName: \"kubernetes.io/projected/17485954-f1e6-4042-9338-ad5115801764-kube-api-access-q9k5h\") pod \"nova-operator-controller-manager-74b6b5dc96-htv95\" (UID: \"17485954-f1e6-4042-9338-ad5115801764\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.441460 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.443131 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.453104 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.453343 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pkj7x"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.453670 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.467641 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.490615 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.509788 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9k5h\" (UniqueName: \"kubernetes.io/projected/17485954-f1e6-4042-9338-ad5115801764-kube-api-access-q9k5h\") pod \"nova-operator-controller-manager-74b6b5dc96-htv95\" (UID: \"17485954-f1e6-4042-9338-ad5115801764\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95"
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.523032 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch"]
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.534886 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.545452 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-jsv7g" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.545727 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.548227 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79nt5\" (UniqueName: \"kubernetes.io/projected/f09b3f70-1158-4269-abf3-acf3fecc0cb9-kube-api-access-79nt5\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.548364 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9zw2\" (UniqueName: \"kubernetes.io/projected/44e5de82-d168-400e-801f-1f122a08c656-kube-api-access-c9zw2\") pod \"octavia-operator-controller-manager-5d86c7ddb7-cmn95\" (UID: \"44e5de82-d168-400e-801f-1f122a08c656\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.548413 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.590581 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9zw2\" (UniqueName: \"kubernetes.io/projected/44e5de82-d168-400e-801f-1f122a08c656-kube-api-access-c9zw2\") pod \"octavia-operator-controller-manager-5d86c7ddb7-cmn95\" (UID: \"44e5de82-d168-400e-801f-1f122a08c656\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.603530 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.604913 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.606210 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.606719 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.608910 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-9j2b4" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.611306 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.623561 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.628674 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.630226 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.634739 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.636328 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-h5b7w" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.636537 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.644579 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.644822 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-qrkk2" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.645499 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.650214 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79nt5\" (UniqueName: \"kubernetes.io/projected/f09b3f70-1158-4269-abf3-acf3fecc0cb9-kube-api-access-79nt5\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.650284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.650364 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.650401 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxq79\" (UniqueName: \"kubernetes.io/projected/3cd0ad0a-7a9e-4870-8b76-58f975cd36e4-kube-api-access-jxq79\") pod \"ovn-operator-controller-manager-75684d597f-dfmch\" (UID: \"3cd0ad0a-7a9e-4870-8b76-58f975cd36e4\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" Mar 08 05:44:26 crc kubenswrapper[4717]: E0308 05:44:26.651848 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 05:44:26 crc kubenswrapper[4717]: E0308 05:44:26.651945 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert podName:be14026d-4e86-4134-8f2a-617e9272d2a1 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:27.651914976 +0000 UTC m=+1094.569563820 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert") pod "infra-operator-controller-manager-f7fcc58b9-vrnlx" (UID: "be14026d-4e86-4134-8f2a-617e9272d2a1") : secret "infra-operator-webhook-server-cert" not found Mar 08 05:44:26 crc kubenswrapper[4717]: E0308 05:44:26.652556 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:26 crc kubenswrapper[4717]: E0308 05:44:26.652598 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert podName:f09b3f70-1158-4269-abf3-acf3fecc0cb9 nodeName:}" failed. 
No retries permitted until 2026-03-08 05:44:27.152587812 +0000 UTC m=+1094.070236656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" (UID: "f09b3f70-1158-4269-abf3-acf3fecc0cb9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.662896 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.686531 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.691876 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.697918 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-5j2g5" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.714313 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79nt5\" (UniqueName: \"kubernetes.io/projected/f09b3f70-1158-4269-abf3-acf3fecc0cb9-kube-api-access-79nt5\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.714394 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.735877 4717 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.737271 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.741734 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6cr2p" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.756216 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxq79\" (UniqueName: \"kubernetes.io/projected/3cd0ad0a-7a9e-4870-8b76-58f975cd36e4-kube-api-access-jxq79\") pod \"ovn-operator-controller-manager-75684d597f-dfmch\" (UID: \"3cd0ad0a-7a9e-4870-8b76-58f975cd36e4\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.761143 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgrbj\" (UniqueName: \"kubernetes.io/projected/d7c1a0d3-1242-402f-88a3-6d45d4c6661a-kube-api-access-dgrbj\") pod \"swift-operator-controller-manager-9b9ff9f4d-djsmm\" (UID: \"d7c1a0d3-1242-402f-88a3-6d45d4c6661a\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.761245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jzwp\" (UniqueName: \"kubernetes.io/projected/3689217b-f2db-4d81-8e68-7f728ce20860-kube-api-access-7jzwp\") pod \"telemetry-operator-controller-manager-5fdb694969-6hf5q\" (UID: \"3689217b-f2db-4d81-8e68-7f728ce20860\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 
05:44:26.761368 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9bc9\" (UniqueName: \"kubernetes.io/projected/ae7df2ae-3ad9-4c73-a957-fe35b87703ec-kube-api-access-b9bc9\") pod \"placement-operator-controller-manager-648564c9fc-bv6fm\" (UID: \"ae7df2ae-3ad9-4c73-a957-fe35b87703ec\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.803010 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxq79\" (UniqueName: \"kubernetes.io/projected/3cd0ad0a-7a9e-4870-8b76-58f975cd36e4-kube-api-access-jxq79\") pod \"ovn-operator-controller-manager-75684d597f-dfmch\" (UID: \"3cd0ad0a-7a9e-4870-8b76-58f975cd36e4\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.805900 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.862652 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgrbj\" (UniqueName: \"kubernetes.io/projected/d7c1a0d3-1242-402f-88a3-6d45d4c6661a-kube-api-access-dgrbj\") pod \"swift-operator-controller-manager-9b9ff9f4d-djsmm\" (UID: \"d7c1a0d3-1242-402f-88a3-6d45d4c6661a\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.862736 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jzwp\" (UniqueName: \"kubernetes.io/projected/3689217b-f2db-4d81-8e68-7f728ce20860-kube-api-access-7jzwp\") pod \"telemetry-operator-controller-manager-5fdb694969-6hf5q\" (UID: \"3689217b-f2db-4d81-8e68-7f728ce20860\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" 
Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.862764 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww2h9\" (UniqueName: \"kubernetes.io/projected/56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5-kube-api-access-ww2h9\") pod \"watcher-operator-controller-manager-bccc79885-hbqjn\" (UID: \"56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.862785 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2gz4\" (UniqueName: \"kubernetes.io/projected/4a173035-b1d9-4435-a2d1-b29e9bea39be-kube-api-access-j2gz4\") pod \"test-operator-controller-manager-55b5ff4dbb-cvjxp\" (UID: \"4a173035-b1d9-4435-a2d1-b29e9bea39be\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.862817 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9bc9\" (UniqueName: \"kubernetes.io/projected/ae7df2ae-3ad9-4c73-a957-fe35b87703ec-kube-api-access-b9bc9\") pod \"placement-operator-controller-manager-648564c9fc-bv6fm\" (UID: \"ae7df2ae-3ad9-4c73-a957-fe35b87703ec\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.863489 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.864467 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.867270 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.867535 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.867806 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-sl58r" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.884987 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgrbj\" (UniqueName: \"kubernetes.io/projected/d7c1a0d3-1242-402f-88a3-6d45d4c6661a-kube-api-access-dgrbj\") pod \"swift-operator-controller-manager-9b9ff9f4d-djsmm\" (UID: \"d7c1a0d3-1242-402f-88a3-6d45d4c6661a\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.891777 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.900201 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jzwp\" (UniqueName: \"kubernetes.io/projected/3689217b-f2db-4d81-8e68-7f728ce20860-kube-api-access-7jzwp\") pod \"telemetry-operator-controller-manager-5fdb694969-6hf5q\" (UID: \"3689217b-f2db-4d81-8e68-7f728ce20860\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.901958 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.905400 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9bc9\" (UniqueName: \"kubernetes.io/projected/ae7df2ae-3ad9-4c73-a957-fe35b87703ec-kube-api-access-b9bc9\") pod \"placement-operator-controller-manager-648564c9fc-bv6fm\" (UID: \"ae7df2ae-3ad9-4c73-a957-fe35b87703ec\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.945364 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.952568 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.953898 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.957376 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl"] Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.957388 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-g9r2m" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.957893 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.965278 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz6dk\" (UniqueName: \"kubernetes.io/projected/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-kube-api-access-qz6dk\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.970364 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww2h9\" (UniqueName: \"kubernetes.io/projected/56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5-kube-api-access-ww2h9\") pod \"watcher-operator-controller-manager-bccc79885-hbqjn\" (UID: \"56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.970462 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2gz4\" (UniqueName: \"kubernetes.io/projected/4a173035-b1d9-4435-a2d1-b29e9bea39be-kube-api-access-j2gz4\") pod \"test-operator-controller-manager-55b5ff4dbb-cvjxp\" (UID: \"4a173035-b1d9-4435-a2d1-b29e9bea39be\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.970557 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:26 crc 
kubenswrapper[4717]: I0308 05:44:26.970757 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.984052 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.992138 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2gz4\" (UniqueName: \"kubernetes.io/projected/4a173035-b1d9-4435-a2d1-b29e9bea39be-kube-api-access-j2gz4\") pod \"test-operator-controller-manager-55b5ff4dbb-cvjxp\" (UID: \"4a173035-b1d9-4435-a2d1-b29e9bea39be\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.995421 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" Mar 08 05:44:26 crc kubenswrapper[4717]: I0308 05:44:26.996129 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww2h9\" (UniqueName: \"kubernetes.io/projected/56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5-kube-api-access-ww2h9\") pod \"watcher-operator-controller-manager-bccc79885-hbqjn\" (UID: \"56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.008931 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.016504 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt"] Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.071350 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.074657 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz6dk\" (UniqueName: \"kubernetes.io/projected/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-kube-api-access-qz6dk\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.074755 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82kr5\" (UniqueName: \"kubernetes.io/projected/7da8d6da-69ae-4351-a774-20888648eac2-kube-api-access-82kr5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4bqgl\" (UID: \"7da8d6da-69ae-4351-a774-20888648eac2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.074796 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.074849 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.074991 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.075051 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:27.575029634 +0000 UTC m=+1094.492678478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "webhook-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.075149 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.075240 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:27.575211768 +0000 UTC m=+1094.492860612 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "metrics-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.107745 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz6dk\" (UniqueName: \"kubernetes.io/projected/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-kube-api-access-qz6dk\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.167115 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h" event={"ID":"8e10706f-2cf2-4b11-a084-33df5b7fe0a1","Type":"ContainerStarted","Data":"e83aca52fde5e769c9b21cd1aaff4691acffbb7a7c752da7b1797e6d42d72ea9"} Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.169176 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt" event={"ID":"8f3bb097-82e6-4fe8-ad89-48004c80477b","Type":"ContainerStarted","Data":"d2a554cf3d38316fd645e2725c5fc7e8d8afb468acfc91812a502c871649a061"} Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.175200 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82kr5\" (UniqueName: \"kubernetes.io/projected/7da8d6da-69ae-4351-a774-20888648eac2-kube-api-access-82kr5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4bqgl\" (UID: \"7da8d6da-69ae-4351-a774-20888648eac2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.175299 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.175484 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.175564 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert podName:f09b3f70-1158-4269-abf3-acf3fecc0cb9 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:28.175541506 +0000 UTC m=+1095.093190350 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" (UID: "f09b3f70-1158-4269-abf3-acf3fecc0cb9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.222549 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82kr5\" (UniqueName: \"kubernetes.io/projected/7da8d6da-69ae-4351-a774-20888648eac2-kube-api-access-82kr5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4bqgl\" (UID: \"7da8d6da-69ae-4351-a774-20888648eac2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.289581 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm"] Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 
05:44:27.305796 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.594115 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.594532 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.594711 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.594764 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:28.594746217 +0000 UTC m=+1095.512395061 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "webhook-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.595136 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.595158 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:28.595151077 +0000 UTC m=+1095.512799921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "metrics-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.696587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.696882 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: E0308 05:44:27.696944 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert 
podName:be14026d-4e86-4134-8f2a-617e9272d2a1 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:29.69692331 +0000 UTC m=+1096.614572154 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert") pod "infra-operator-controller-manager-f7fcc58b9-vrnlx" (UID: "be14026d-4e86-4134-8f2a-617e9272d2a1") : secret "infra-operator-webhook-server-cert" not found Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.854941 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd"] Mar 08 05:44:27 crc kubenswrapper[4717]: W0308 05:44:27.872280 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod148c1a2c_7098_4111_a12e_02e2dcc295a6.slice/crio-f9d7dbcbd9e48902fcbdf658cec1e813eb56e09f86e7216676a07f75147ee7dc WatchSource:0}: Error finding container f9d7dbcbd9e48902fcbdf658cec1e813eb56e09f86e7216676a07f75147ee7dc: Status 404 returned error can't find the container with id f9d7dbcbd9e48902fcbdf658cec1e813eb56e09f86e7216676a07f75147ee7dc Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.881226 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf"] Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.886624 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj"] Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.892765 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4"] Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.898868 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls"] Mar 08 
05:44:27 crc kubenswrapper[4717]: W0308 05:44:27.914128 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod999e5f1a_4be7_4716_8999_e28027c618b9.slice/crio-660a6a743ab675aac31329090a0e901f4120913872b5ae23e536cbeb3fa6e552 WatchSource:0}: Error finding container 660a6a743ab675aac31329090a0e901f4120913872b5ae23e536cbeb3fa6e552: Status 404 returned error can't find the container with id 660a6a743ab675aac31329090a0e901f4120913872b5ae23e536cbeb3fa6e552 Mar 08 05:44:27 crc kubenswrapper[4717]: W0308 05:44:27.915369 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11010a39_3786_472f_ad04_805c35647afc.slice/crio-bce393c2956d3a07711c044f4a182d6d4e99581bd63d1932663effcfdd348c43 WatchSource:0}: Error finding container bce393c2956d3a07711c044f4a182d6d4e99581bd63d1932663effcfdd348c43: Status 404 returned error can't find the container with id bce393c2956d3a07711c044f4a182d6d4e99581bd63d1932663effcfdd348c43 Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.916111 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj"] Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.927132 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb"] Mar 08 05:44:27 crc kubenswrapper[4717]: I0308 05:44:27.940970 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7"] Mar 08 05:44:27 crc kubenswrapper[4717]: W0308 05:44:27.949297 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c64b2a6_6663_46cc_b762_bffa01baeb47.slice/crio-809ee8482d322c77d935e28ae8598908911936cd5288e8e6d06eb8cb16335658 WatchSource:0}: Error 
finding container 809ee8482d322c77d935e28ae8598908911936cd5288e8e6d06eb8cb16335658: Status 404 returned error can't find the container with id 809ee8482d322c77d935e28ae8598908911936cd5288e8e6d06eb8cb16335658 Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.194733 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7" event={"ID":"9c64b2a6-6663-46cc-b762-bffa01baeb47","Type":"ContainerStarted","Data":"809ee8482d322c77d935e28ae8598908911936cd5288e8e6d06eb8cb16335658"} Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.196186 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm" event={"ID":"bf98a4b8-6e3c-423d-b228-347c527e6721","Type":"ContainerStarted","Data":"59448e8ac7bc6ad513e51c780b8c5e33000542e23c63b72cd536882aa087573a"} Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.198533 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd" event={"ID":"148c1a2c-7098-4111-a12e-02e2dcc295a6","Type":"ContainerStarted","Data":"f9d7dbcbd9e48902fcbdf658cec1e813eb56e09f86e7216676a07f75147ee7dc"} Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.207747 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj" event={"ID":"7eda0f52-4fcf-46fe-b329-075fb4d79c74","Type":"ContainerStarted","Data":"3d90d98cc4dcdac819e3417162512e075f26ce141a99f617075205f53f89e39a"} Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.207807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.207965 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.208041 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert podName:f09b3f70-1158-4269-abf3-acf3fecc0cb9 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:30.20801513 +0000 UTC m=+1097.125663974 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" (UID: "f09b3f70-1158-4269-abf3-acf3fecc0cb9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.210661 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4" event={"ID":"999e5f1a-4be7-4716-8999-e28027c618b9","Type":"ContainerStarted","Data":"660a6a743ab675aac31329090a0e901f4120913872b5ae23e536cbeb3fa6e552"} Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.212347 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf" event={"ID":"74fc8d21-150d-4009-b0ba-b6a47db5adbb","Type":"ContainerStarted","Data":"3b3904294390cc26a2cbc619872a82130ee3d857a1c00500068f5cb40ae5fa9d"} Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.213563 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj" 
event={"ID":"9fe70b75-885a-402b-98e1-f5c696e47f48","Type":"ContainerStarted","Data":"f7e2b71e1908b79b9daa4d2813cb2460c75dd17a9798ef0a3d8afdbf5d0ebe48"} Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.214602 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls" event={"ID":"11010a39-3786-472f-ad04-805c35647afc","Type":"ContainerStarted","Data":"bce393c2956d3a07711c044f4a182d6d4e99581bd63d1932663effcfdd348c43"} Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.216871 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb" event={"ID":"7668ece6-7b88-4707-baf2-62379071cf43","Type":"ContainerStarted","Data":"07a6f6da9265c5db4d1265c810441786137dd46f5d05a0d256205853603397c4"} Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.291409 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl"] Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.315706 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95"] Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.323508 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm"] Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.337005 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm"] Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.342639 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp"] Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.351058 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q"] Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.363049 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch"] Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.367620 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jxq79,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-75684d597f-dfmch_openstack-operators(3cd0ad0a-7a9e-4870-8b76-58f975cd36e4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.368713 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ww2h9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-hbqjn_openstack-operators(56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.368795 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" podUID="3cd0ad0a-7a9e-4870-8b76-58f975cd36e4" Mar 08 05:44:28 crc 
kubenswrapper[4717]: E0308 05:44:28.369800 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" podUID="56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5" Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.373590 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn"] Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.377192 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b9bc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-bv6fm_openstack-operators(ae7df2ae-3ad9-4c73-a957-fe35b87703ec): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.378591 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" podUID="ae7df2ae-3ad9-4c73-a957-fe35b87703ec" Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.380008 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c9zw2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-cmn95_openstack-operators(44e5de82-d168-400e-801f-1f122a08c656): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.380084 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95"] Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.381444 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" podUID="44e5de82-d168-400e-801f-1f122a08c656" Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.383169 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jzwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5fdb694969-6hf5q_openstack-operators(3689217b-f2db-4d81-8e68-7f728ce20860): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.385007 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" podUID="3689217b-f2db-4d81-8e68-7f728ce20860" Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.620628 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:28 crc kubenswrapper[4717]: I0308 05:44:28.620737 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.620953 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.621004 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.621034 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:30.621013529 +0000 UTC m=+1097.538662373 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "webhook-server-cert" not found Mar 08 05:44:28 crc kubenswrapper[4717]: E0308 05:44:28.621126 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:30.621099961 +0000 UTC m=+1097.538748805 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "metrics-server-cert" not found Mar 08 05:44:29 crc kubenswrapper[4717]: I0308 05:44:29.237714 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" event={"ID":"d7c1a0d3-1242-402f-88a3-6d45d4c6661a","Type":"ContainerStarted","Data":"c0ec132a0cbe6b3f2434c36a96c2a19731e5c8f39fd233bcec9a86c04a2761aa"} Mar 08 05:44:29 crc kubenswrapper[4717]: I0308 05:44:29.241215 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95" event={"ID":"17485954-f1e6-4042-9338-ad5115801764","Type":"ContainerStarted","Data":"68cbf6077e1cb1d960dfb860e4ac53706b87b5182a4d90568b87cf8e7c67fce0"} Mar 08 05:44:29 crc kubenswrapper[4717]: I0308 05:44:29.243857 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl" event={"ID":"7da8d6da-69ae-4351-a774-20888648eac2","Type":"ContainerStarted","Data":"6b1510dad6af08ea7c61833981e02190f06a0934d19c9eec9ba703b3252f09fc"} Mar 08 05:44:29 crc kubenswrapper[4717]: I0308 05:44:29.252893 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" event={"ID":"44e5de82-d168-400e-801f-1f122a08c656","Type":"ContainerStarted","Data":"195440d41033f946509a5dd40afe931129b36349c8a39982aef0e6c9d8bfd688"} Mar 08 05:44:29 crc kubenswrapper[4717]: I0308 05:44:29.255893 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" 
event={"ID":"3689217b-f2db-4d81-8e68-7f728ce20860","Type":"ContainerStarted","Data":"8cc5c04afcef769fbee387d25def97a774ba20417529a27fe9c1f0e821cb6089"} Mar 08 05:44:29 crc kubenswrapper[4717]: E0308 05:44:29.256895 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" podUID="44e5de82-d168-400e-801f-1f122a08c656" Mar 08 05:44:29 crc kubenswrapper[4717]: I0308 05:44:29.257739 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp" event={"ID":"4a173035-b1d9-4435-a2d1-b29e9bea39be","Type":"ContainerStarted","Data":"b60598c0ffe19f19d15ccbd87db4b34ec0c8a0b3d9777c6c73d1542508c2687c"} Mar 08 05:44:29 crc kubenswrapper[4717]: E0308 05:44:29.258771 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" podUID="3689217b-f2db-4d81-8e68-7f728ce20860" Mar 08 05:44:29 crc kubenswrapper[4717]: I0308 05:44:29.260997 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" event={"ID":"ae7df2ae-3ad9-4c73-a957-fe35b87703ec","Type":"ContainerStarted","Data":"08ca4f2aa1bf2bf63f3d0426b2a44f37d8c20958dfb37f39e957172c45f4cd18"} Mar 08 05:44:29 crc kubenswrapper[4717]: E0308 05:44:29.263774 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" podUID="ae7df2ae-3ad9-4c73-a957-fe35b87703ec" Mar 08 05:44:29 crc kubenswrapper[4717]: I0308 05:44:29.264452 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" event={"ID":"3cd0ad0a-7a9e-4870-8b76-58f975cd36e4","Type":"ContainerStarted","Data":"6a4d0c379c5c6c1864930ab784952c3e45d7aad8dec8e095cfadd5c3ca326646"} Mar 08 05:44:29 crc kubenswrapper[4717]: I0308 05:44:29.283307 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" event={"ID":"56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5","Type":"ContainerStarted","Data":"0cc2eabb666b7ace2dd9820cf1e88be39952bb95605696002d9e10dac73dc448"} Mar 08 05:44:29 crc kubenswrapper[4717]: E0308 05:44:29.307444 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" podUID="3cd0ad0a-7a9e-4870-8b76-58f975cd36e4" Mar 08 05:44:29 crc kubenswrapper[4717]: E0308 05:44:29.307875 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" podUID="56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5" Mar 08 05:44:29 crc kubenswrapper[4717]: I0308 05:44:29.740380 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" Mar 08 05:44:29 crc kubenswrapper[4717]: E0308 05:44:29.740761 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 05:44:29 crc kubenswrapper[4717]: E0308 05:44:29.740880 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert podName:be14026d-4e86-4134-8f2a-617e9272d2a1 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:33.74085099 +0000 UTC m=+1100.658499834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert") pod "infra-operator-controller-manager-f7fcc58b9-vrnlx" (UID: "be14026d-4e86-4134-8f2a-617e9272d2a1") : secret "infra-operator-webhook-server-cert" not found Mar 08 05:44:30 crc kubenswrapper[4717]: I0308 05:44:30.248420 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.248636 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.248741 4717 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert podName:f09b3f70-1158-4269-abf3-acf3fecc0cb9 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:34.24872095 +0000 UTC m=+1101.166369794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" (UID: "f09b3f70-1158-4269-abf3-acf3fecc0cb9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.295531 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" podUID="3cd0ad0a-7a9e-4870-8b76-58f975cd36e4" Mar 08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.296799 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" podUID="56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5" Mar 08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.296935 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" podUID="ae7df2ae-3ad9-4c73-a957-fe35b87703ec" Mar 
08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.297037 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" podUID="44e5de82-d168-400e-801f-1f122a08c656" Mar 08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.300665 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" podUID="3689217b-f2db-4d81-8e68-7f728ce20860" Mar 08 05:44:30 crc kubenswrapper[4717]: I0308 05:44:30.654595 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:30 crc kubenswrapper[4717]: I0308 05:44:30.655151 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.654967 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret 
"metrics-server-cert" not found Mar 08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.655316 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.655359 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:34.655332611 +0000 UTC m=+1101.572981455 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "metrics-server-cert" not found Mar 08 05:44:30 crc kubenswrapper[4717]: E0308 05:44:30.655391 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:34.655369212 +0000 UTC m=+1101.573018056 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "webhook-server-cert" not found Mar 08 05:44:33 crc kubenswrapper[4717]: I0308 05:44:33.813110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" Mar 08 05:44:33 crc kubenswrapper[4717]: E0308 05:44:33.813318 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 05:44:33 crc kubenswrapper[4717]: E0308 05:44:33.813896 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert podName:be14026d-4e86-4134-8f2a-617e9272d2a1 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:41.813846713 +0000 UTC m=+1108.731495557 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert") pod "infra-operator-controller-manager-f7fcc58b9-vrnlx" (UID: "be14026d-4e86-4134-8f2a-617e9272d2a1") : secret "infra-operator-webhook-server-cert" not found Mar 08 05:44:34 crc kubenswrapper[4717]: I0308 05:44:34.322030 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:34 crc kubenswrapper[4717]: E0308 05:44:34.322229 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:34 crc kubenswrapper[4717]: E0308 05:44:34.322335 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert podName:f09b3f70-1158-4269-abf3-acf3fecc0cb9 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:42.322302468 +0000 UTC m=+1109.239951312 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" (UID: "f09b3f70-1158-4269-abf3-acf3fecc0cb9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:34 crc kubenswrapper[4717]: I0308 05:44:34.730110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:34 crc kubenswrapper[4717]: E0308 05:44:34.730250 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 05:44:34 crc kubenswrapper[4717]: E0308 05:44:34.730348 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:42.730324313 +0000 UTC m=+1109.647973157 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "metrics-server-cert" not found Mar 08 05:44:34 crc kubenswrapper[4717]: I0308 05:44:34.730453 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:34 crc kubenswrapper[4717]: E0308 05:44:34.730785 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 05:44:34 crc kubenswrapper[4717]: E0308 05:44:34.730949 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:42.730912118 +0000 UTC m=+1109.648560992 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "webhook-server-cert" not found Mar 08 05:44:41 crc kubenswrapper[4717]: I0308 05:44:41.890905 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" Mar 08 05:44:41 crc kubenswrapper[4717]: E0308 05:44:41.891147 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 05:44:41 crc kubenswrapper[4717]: E0308 05:44:41.891984 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert podName:be14026d-4e86-4134-8f2a-617e9272d2a1 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:57.891946175 +0000 UTC m=+1124.809595059 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert") pod "infra-operator-controller-manager-f7fcc58b9-vrnlx" (UID: "be14026d-4e86-4134-8f2a-617e9272d2a1") : secret "infra-operator-webhook-server-cert" not found Mar 08 05:44:42 crc kubenswrapper[4717]: I0308 05:44:42.402296 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:42 crc kubenswrapper[4717]: E0308 05:44:42.402566 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:42 crc kubenswrapper[4717]: E0308 05:44:42.403165 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert podName:f09b3f70-1158-4269-abf3-acf3fecc0cb9 nodeName:}" failed. No retries permitted until 2026-03-08 05:44:58.403127288 +0000 UTC m=+1125.320776362 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" (UID: "f09b3f70-1158-4269-abf3-acf3fecc0cb9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 05:44:42 crc kubenswrapper[4717]: E0308 05:44:42.746156 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26" Mar 08 05:44:42 crc kubenswrapper[4717]: E0308 05:44:42.746558 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sbr46,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-67d996989d-mp5dj_openstack-operators(7eda0f52-4fcf-46fe-b329-075fb4d79c74): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:44:42 crc kubenswrapper[4717]: E0308 05:44:42.747884 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj" podUID="7eda0f52-4fcf-46fe-b329-075fb4d79c74" Mar 08 05:44:42 crc kubenswrapper[4717]: I0308 05:44:42.811576 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs\") pod 
\"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:42 crc kubenswrapper[4717]: I0308 05:44:42.811800 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:42 crc kubenswrapper[4717]: E0308 05:44:42.812003 4717 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 05:44:42 crc kubenswrapper[4717]: E0308 05:44:42.812009 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 05:44:42 crc kubenswrapper[4717]: E0308 05:44:42.812082 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. No retries permitted until 2026-03-08 05:44:58.812058157 +0000 UTC m=+1125.729707001 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "metrics-server-cert" not found Mar 08 05:44:42 crc kubenswrapper[4717]: E0308 05:44:42.812171 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs podName:1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b nodeName:}" failed. 
No retries permitted until 2026-03-08 05:44:58.812123528 +0000 UTC m=+1125.729772542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-8wwfb" (UID: "1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b") : secret "webhook-server-cert" not found Mar 08 05:44:43 crc kubenswrapper[4717]: E0308 05:44:43.428002 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj" podUID="7eda0f52-4fcf-46fe-b329-075fb4d79c74" Mar 08 05:44:45 crc kubenswrapper[4717]: E0308 05:44:45.011214 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3" Mar 08 05:44:45 crc kubenswrapper[4717]: E0308 05:44:45.011825 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nl7bt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-78bc7f9bd9-s8gsj_openstack-operators(9fe70b75-885a-402b-98e1-f5c696e47f48): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:44:45 crc kubenswrapper[4717]: E0308 05:44:45.013272 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj" podUID="9fe70b75-885a-402b-98e1-f5c696e47f48" Mar 08 05:44:45 crc kubenswrapper[4717]: E0308 05:44:45.444797 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj" podUID="9fe70b75-885a-402b-98e1-f5c696e47f48" Mar 08 05:44:45 crc kubenswrapper[4717]: E0308 05:44:45.599391 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7" Mar 08 05:44:45 crc kubenswrapper[4717]: E0308 05:44:45.599641 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dgrbj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9b9ff9f4d-djsmm_openstack-operators(d7c1a0d3-1242-402f-88a3-6d45d4c6661a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:44:45 crc kubenswrapper[4717]: E0308 05:44:45.600817 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" podUID="d7c1a0d3-1242-402f-88a3-6d45d4c6661a" Mar 08 05:44:46 crc kubenswrapper[4717]: E0308 05:44:46.238957 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 08 05:44:46 crc kubenswrapper[4717]: E0308 05:44:46.239346 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q5hr5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-wmjsb_openstack-operators(7668ece6-7b88-4707-baf2-62379071cf43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:44:46 crc kubenswrapper[4717]: E0308 05:44:46.240639 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb" podUID="7668ece6-7b88-4707-baf2-62379071cf43" Mar 08 05:44:46 crc kubenswrapper[4717]: E0308 05:44:46.454118 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" podUID="d7c1a0d3-1242-402f-88a3-6d45d4c6661a" Mar 08 05:44:46 crc kubenswrapper[4717]: E0308 05:44:46.455090 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb" podUID="7668ece6-7b88-4707-baf2-62379071cf43" Mar 08 05:44:48 crc kubenswrapper[4717]: E0308 05:44:48.294287 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 08 05:44:48 crc kubenswrapper[4717]: E0308 05:44:48.295587 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-82kr5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-4bqgl_openstack-operators(7da8d6da-69ae-4351-a774-20888648eac2): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:44:48 crc kubenswrapper[4717]: E0308 05:44:48.296936 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl" podUID="7da8d6da-69ae-4351-a774-20888648eac2" Mar 08 05:44:48 crc kubenswrapper[4717]: E0308 05:44:48.475124 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl" podUID="7da8d6da-69ae-4351-a774-20888648eac2" Mar 08 05:44:48 crc kubenswrapper[4717]: E0308 05:44:48.907778 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84" Mar 08 05:44:48 crc kubenswrapper[4717]: E0308 05:44:48.908463 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q9k5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-htv95_openstack-operators(17485954-f1e6-4042-9338-ad5115801764): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:44:48 crc kubenswrapper[4717]: E0308 05:44:48.910244 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95" podUID="17485954-f1e6-4042-9338-ad5115801764" Mar 08 05:44:49 crc kubenswrapper[4717]: E0308 05:44:49.482435 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95" podUID="17485954-f1e6-4042-9338-ad5115801764" Mar 08 05:44:50 crc kubenswrapper[4717]: I0308 05:44:50.490279 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt" event={"ID":"8f3bb097-82e6-4fe8-ad89-48004c80477b","Type":"ContainerStarted","Data":"f061d5ae138363f15386f2ac05adba432032d5547beb112241209379d1b2c563"} Mar 08 05:44:50 crc kubenswrapper[4717]: I0308 05:44:50.490633 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt" Mar 08 05:44:50 crc kubenswrapper[4717]: I0308 05:44:50.494090 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp" event={"ID":"4a173035-b1d9-4435-a2d1-b29e9bea39be","Type":"ContainerStarted","Data":"b8c184d23dbb1b0f258bdad6c85a3c2b27f48d2331e823ae6c766387f9593d83"} Mar 08 05:44:50 crc kubenswrapper[4717]: I0308 05:44:50.495122 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp" Mar 08 05:44:50 crc kubenswrapper[4717]: I0308 05:44:50.497372 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7" event={"ID":"9c64b2a6-6663-46cc-b762-bffa01baeb47","Type":"ContainerStarted","Data":"09dd465a79aa2706fec4d0a0c72aca579760f731c1223235036e597269c749c4"} Mar 08 05:44:50 crc kubenswrapper[4717]: I0308 05:44:50.497805 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7" Mar 08 05:44:50 crc kubenswrapper[4717]: I0308 05:44:50.507139 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt" podStartSLOduration=5.341931318 podStartE2EDuration="25.505970373s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:27.125121391 +0000 UTC m=+1094.042770235" lastFinishedPulling="2026-03-08 
05:44:47.289160446 +0000 UTC m=+1114.206809290" observedRunningTime="2026-03-08 05:44:50.504795144 +0000 UTC m=+1117.422443988" watchObservedRunningTime="2026-03-08 05:44:50.505970373 +0000 UTC m=+1117.423619217" Mar 08 05:44:50 crc kubenswrapper[4717]: I0308 05:44:50.523435 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7" podStartSLOduration=3.488144388 podStartE2EDuration="25.523408863s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:27.951499456 +0000 UTC m=+1094.869148300" lastFinishedPulling="2026-03-08 05:44:49.986763931 +0000 UTC m=+1116.904412775" observedRunningTime="2026-03-08 05:44:50.522415549 +0000 UTC m=+1117.440064393" watchObservedRunningTime="2026-03-08 05:44:50.523408863 +0000 UTC m=+1117.441057707" Mar 08 05:44:50 crc kubenswrapper[4717]: I0308 05:44:50.541825 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp" podStartSLOduration=6.188440405 podStartE2EDuration="24.541800828s" podCreationTimestamp="2026-03-08 05:44:26 +0000 UTC" firstStartedPulling="2026-03-08 05:44:28.367423107 +0000 UTC m=+1095.285071951" lastFinishedPulling="2026-03-08 05:44:46.72078353 +0000 UTC m=+1113.638432374" observedRunningTime="2026-03-08 05:44:50.540315721 +0000 UTC m=+1117.457964565" watchObservedRunningTime="2026-03-08 05:44:50.541800828 +0000 UTC m=+1117.459449672" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.506912 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls" event={"ID":"11010a39-3786-472f-ad04-805c35647afc","Type":"ContainerStarted","Data":"0e48b86f08df23a96f91ee244e5075c34ca2a11f8783292fff5fd0565903e793"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.507444 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.509011 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" event={"ID":"ae7df2ae-3ad9-4c73-a957-fe35b87703ec","Type":"ContainerStarted","Data":"eab1032aa649dcfb79b233cd2a1ea2ad40971969170654867264f562c5b067da"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.509830 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.511329 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" event={"ID":"3cd0ad0a-7a9e-4870-8b76-58f975cd36e4","Type":"ContainerStarted","Data":"935a399c24eda89cfdad543370ee90dab41c596b2fbb0692acd5b519eb3eef31"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.511676 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.513394 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4" event={"ID":"999e5f1a-4be7-4716-8999-e28027c618b9","Type":"ContainerStarted","Data":"7f237af5cbe4a7614767e8899a7cce6f5034836a7b8e84e1c9dfe2126c43f7d6"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.513765 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.515626 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf" 
event={"ID":"74fc8d21-150d-4009-b0ba-b6a47db5adbb","Type":"ContainerStarted","Data":"dceee132118e96c45a0eebf22649328c2c4075eb76be86dd1e65ac7855da117c"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.515994 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.517459 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" event={"ID":"3689217b-f2db-4d81-8e68-7f728ce20860","Type":"ContainerStarted","Data":"bc4630ca427f1e98ac0cce46503f774c42a672d0c55ed6c211e737290804e47f"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.517797 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.519023 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm" event={"ID":"bf98a4b8-6e3c-423d-b228-347c527e6721","Type":"ContainerStarted","Data":"53b29746c2a766065a8df2893e29974bb94b1bf4bc5a50bf6ff7eed4c5cfbcfe"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.519417 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.520611 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd" event={"ID":"148c1a2c-7098-4111-a12e-02e2dcc295a6","Type":"ContainerStarted","Data":"f8d322ffdd82fe410b8c984b275868b2fb4a561ee72225c4fb76f085004a6462"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.520971 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.522200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h" event={"ID":"8e10706f-2cf2-4b11-a084-33df5b7fe0a1","Type":"ContainerStarted","Data":"ef1796416d99e8d384642026130d8452a7198e452202685d7aa2575ce4a776c7"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.522513 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.523845 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" event={"ID":"56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5","Type":"ContainerStarted","Data":"207cd92c2d1c8ed0148ccb1de5a48a40747097bdbf2bc6f95834a5dc0fe1bede"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.524179 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.525631 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" event={"ID":"44e5de82-d168-400e-801f-1f122a08c656","Type":"ContainerStarted","Data":"167515f9ef3537200ad49bef8ed4409e180a56d026c874c5f77eeba96c9af8ea"} Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.525948 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.555812 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" 
podStartSLOduration=3.780656466 podStartE2EDuration="25.555786477s" podCreationTimestamp="2026-03-08 05:44:26 +0000 UTC" firstStartedPulling="2026-03-08 05:44:28.377019464 +0000 UTC m=+1095.294668298" lastFinishedPulling="2026-03-08 05:44:50.152149425 +0000 UTC m=+1117.069798309" observedRunningTime="2026-03-08 05:44:51.552610418 +0000 UTC m=+1118.470259262" watchObservedRunningTime="2026-03-08 05:44:51.555786477 +0000 UTC m=+1118.473435321" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.558167 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls" podStartSLOduration=5.108222804 podStartE2EDuration="26.558161305s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:27.927152255 +0000 UTC m=+1094.844801099" lastFinishedPulling="2026-03-08 05:44:49.377090716 +0000 UTC m=+1116.294739600" observedRunningTime="2026-03-08 05:44:51.535065555 +0000 UTC m=+1118.452714399" watchObservedRunningTime="2026-03-08 05:44:51.558161305 +0000 UTC m=+1118.475810149" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.634942 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h" podStartSLOduration=4.065516227 podStartE2EDuration="26.63490434s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:26.805850557 +0000 UTC m=+1093.723499401" lastFinishedPulling="2026-03-08 05:44:49.37523866 +0000 UTC m=+1116.292887514" observedRunningTime="2026-03-08 05:44:51.583549902 +0000 UTC m=+1118.501198746" watchObservedRunningTime="2026-03-08 05:44:51.63490434 +0000 UTC m=+1118.552553174" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.639704 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" 
podStartSLOduration=4.914293936 podStartE2EDuration="26.639696689s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:28.379738901 +0000 UTC m=+1095.297387745" lastFinishedPulling="2026-03-08 05:44:50.105141624 +0000 UTC m=+1117.022790498" observedRunningTime="2026-03-08 05:44:51.620202347 +0000 UTC m=+1118.537851191" watchObservedRunningTime="2026-03-08 05:44:51.639696689 +0000 UTC m=+1118.557345533" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.688018 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" podStartSLOduration=3.92258377 podStartE2EDuration="25.687995742s" podCreationTimestamp="2026-03-08 05:44:26 +0000 UTC" firstStartedPulling="2026-03-08 05:44:28.367442127 +0000 UTC m=+1095.285090971" lastFinishedPulling="2026-03-08 05:44:50.132854099 +0000 UTC m=+1117.050502943" observedRunningTime="2026-03-08 05:44:51.67455873 +0000 UTC m=+1118.592207574" watchObservedRunningTime="2026-03-08 05:44:51.687995742 +0000 UTC m=+1118.605644586" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.690148 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd" podStartSLOduration=5.19348021 podStartE2EDuration="26.690141915s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:27.880873202 +0000 UTC m=+1094.798522046" lastFinishedPulling="2026-03-08 05:44:49.377534897 +0000 UTC m=+1116.295183751" observedRunningTime="2026-03-08 05:44:51.652609558 +0000 UTC m=+1118.570258402" watchObservedRunningTime="2026-03-08 05:44:51.690141915 +0000 UTC m=+1118.607790749" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.707431 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4" 
podStartSLOduration=7.336169152 podStartE2EDuration="26.707402061s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:27.916584864 +0000 UTC m=+1094.834233708" lastFinishedPulling="2026-03-08 05:44:47.287817783 +0000 UTC m=+1114.205466617" observedRunningTime="2026-03-08 05:44:51.702191802 +0000 UTC m=+1118.619840646" watchObservedRunningTime="2026-03-08 05:44:51.707402061 +0000 UTC m=+1118.625050905" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.726949 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" podStartSLOduration=3.986464477 podStartE2EDuration="25.726924503s" podCreationTimestamp="2026-03-08 05:44:26 +0000 UTC" firstStartedPulling="2026-03-08 05:44:28.38295016 +0000 UTC m=+1095.300599004" lastFinishedPulling="2026-03-08 05:44:50.123410186 +0000 UTC m=+1117.041059030" observedRunningTime="2026-03-08 05:44:51.724991285 +0000 UTC m=+1118.642640129" watchObservedRunningTime="2026-03-08 05:44:51.726924503 +0000 UTC m=+1118.644573347" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.747119 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm" podStartSLOduration=4.067922526 podStartE2EDuration="26.747095651s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:27.371295229 +0000 UTC m=+1094.288944073" lastFinishedPulling="2026-03-08 05:44:50.050468354 +0000 UTC m=+1116.968117198" observedRunningTime="2026-03-08 05:44:51.74095968 +0000 UTC m=+1118.658608534" watchObservedRunningTime="2026-03-08 05:44:51.747095651 +0000 UTC m=+1118.664744495" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.764600 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" 
podStartSLOduration=4.008211055 podStartE2EDuration="25.764578273s" podCreationTimestamp="2026-03-08 05:44:26 +0000 UTC" firstStartedPulling="2026-03-08 05:44:28.368479323 +0000 UTC m=+1095.286128167" lastFinishedPulling="2026-03-08 05:44:50.124846541 +0000 UTC m=+1117.042495385" observedRunningTime="2026-03-08 05:44:51.759498238 +0000 UTC m=+1118.677147082" watchObservedRunningTime="2026-03-08 05:44:51.764578273 +0000 UTC m=+1118.682227117" Mar 08 05:44:51 crc kubenswrapper[4717]: I0308 05:44:51.783300 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf" podStartSLOduration=5.311324021 podStartE2EDuration="26.783272915s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:27.905214603 +0000 UTC m=+1094.822863447" lastFinishedPulling="2026-03-08 05:44:49.377163477 +0000 UTC m=+1116.294812341" observedRunningTime="2026-03-08 05:44:51.77578907 +0000 UTC m=+1118.693437914" watchObservedRunningTime="2026-03-08 05:44:51.783272915 +0000 UTC m=+1118.700921759" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.019304 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-x5kbt" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.056197 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-7x79h" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.077547 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-wdwbm" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.309923 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-vpqhf" Mar 08 05:44:56 crc 
kubenswrapper[4717]: I0308 05:44:56.446800 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x44f7" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.470828 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-vbdls" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.498484 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-959nd" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.570826 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj" event={"ID":"7eda0f52-4fcf-46fe-b329-075fb4d79c74","Type":"ContainerStarted","Data":"9ed86a68b71c6dfbec6e21e272c0bff8b6d289c7e5242e3a2985a326b638af33"} Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.572006 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.590128 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj" podStartSLOduration=3.18976092 podStartE2EDuration="31.590109118s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:27.932587969 +0000 UTC m=+1094.850236813" lastFinishedPulling="2026-03-08 05:44:56.332936157 +0000 UTC m=+1123.250585011" observedRunningTime="2026-03-08 05:44:56.588081688 +0000 UTC m=+1123.505730532" watchObservedRunningTime="2026-03-08 05:44:56.590109118 +0000 UTC m=+1123.507757962" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.610963 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-54688575f-hl7k4" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.648948 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-cmn95" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.908627 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dfmch" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.961882 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-bv6fm" Mar 08 05:44:56 crc kubenswrapper[4717]: I0308 05:44:56.998200 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6hf5q" Mar 08 05:44:57 crc kubenswrapper[4717]: I0308 05:44:57.013013 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-cvjxp" Mar 08 05:44:57 crc kubenswrapper[4717]: I0308 05:44:57.079491 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-hbqjn" Mar 08 05:44:57 crc kubenswrapper[4717]: I0308 05:44:57.926663 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" Mar 08 05:44:57 crc kubenswrapper[4717]: I0308 05:44:57.939477 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/be14026d-4e86-4134-8f2a-617e9272d2a1-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vrnlx\" (UID: \"be14026d-4e86-4134-8f2a-617e9272d2a1\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.036210 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.413412 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx"] Mar 08 05:44:58 crc kubenswrapper[4717]: W0308 05:44:58.426567 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe14026d_4e86_4134_8f2a_617e9272d2a1.slice/crio-9adbac55c632a3240a480dea958e0548136c235fd34363fc9f50052e20f79d3c WatchSource:0}: Error finding container 9adbac55c632a3240a480dea958e0548136c235fd34363fc9f50052e20f79d3c: Status 404 returned error can't find the container with id 9adbac55c632a3240a480dea958e0548136c235fd34363fc9f50052e20f79d3c Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.441947 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.454314 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09b3f70-1158-4269-abf3-acf3fecc0cb9-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n\" (UID: \"f09b3f70-1158-4269-abf3-acf3fecc0cb9\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.598038 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" event={"ID":"be14026d-4e86-4134-8f2a-617e9272d2a1","Type":"ContainerStarted","Data":"9adbac55c632a3240a480dea958e0548136c235fd34363fc9f50052e20f79d3c"} Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.603510 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb" event={"ID":"7668ece6-7b88-4707-baf2-62379071cf43","Type":"ContainerStarted","Data":"0349c741ac69d4778c74fd89c720454a26fa1d65d681a103547838f1e2b7c598"} Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.604308 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.614437 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" event={"ID":"d7c1a0d3-1242-402f-88a3-6d45d4c6661a","Type":"ContainerStarted","Data":"61fa174423952c767473b301c589b99a4492397393dfa08b50844a8a2373e8c2"} Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.614917 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.628599 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb" podStartSLOduration=3.271385856 podStartE2EDuration="33.628576506s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:27.928352475 +0000 UTC m=+1094.846001319" lastFinishedPulling="2026-03-08 
05:44:58.285543085 +0000 UTC m=+1125.203191969" observedRunningTime="2026-03-08 05:44:58.620763233 +0000 UTC m=+1125.538412077" watchObservedRunningTime="2026-03-08 05:44:58.628576506 +0000 UTC m=+1125.546225350" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.637024 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" podStartSLOduration=2.679668588 podStartE2EDuration="32.637013495s" podCreationTimestamp="2026-03-08 05:44:26 +0000 UTC" firstStartedPulling="2026-03-08 05:44:28.328248209 +0000 UTC m=+1095.245897043" lastFinishedPulling="2026-03-08 05:44:58.285593066 +0000 UTC m=+1125.203241950" observedRunningTime="2026-03-08 05:44:58.636890712 +0000 UTC m=+1125.554539596" watchObservedRunningTime="2026-03-08 05:44:58.637013495 +0000 UTC m=+1125.554662339" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.638397 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.847376 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.847519 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 
05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.852376 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.853001 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-8wwfb\" (UID: \"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:58 crc kubenswrapper[4717]: I0308 05:44:58.997642 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:59 crc kubenswrapper[4717]: I0308 05:44:59.160278 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n"] Mar 08 05:44:59 crc kubenswrapper[4717]: W0308 05:44:59.172063 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf09b3f70_1158_4269_abf3_acf3fecc0cb9.slice/crio-0c01e14c21034f7ad705bd18ed7c9b05e7ace8dbeeb6a76e5c3351a0b638ce40 WatchSource:0}: Error finding container 0c01e14c21034f7ad705bd18ed7c9b05e7ace8dbeeb6a76e5c3351a0b638ce40: Status 404 returned error can't find the container with id 0c01e14c21034f7ad705bd18ed7c9b05e7ace8dbeeb6a76e5c3351a0b638ce40 Mar 08 05:44:59 crc kubenswrapper[4717]: I0308 05:44:59.253136 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb"] Mar 08 05:44:59 crc kubenswrapper[4717]: W0308 05:44:59.268888 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fdfe5d1_1d8b_4016_843f_6ba5703c9f6b.slice/crio-408242b219c3939caac0bab97181dc2915811d13c6ba9a0fa009e4fdc530e580 WatchSource:0}: Error finding container 408242b219c3939caac0bab97181dc2915811d13c6ba9a0fa009e4fdc530e580: Status 404 returned error can't find the container with id 408242b219c3939caac0bab97181dc2915811d13c6ba9a0fa009e4fdc530e580 Mar 08 05:44:59 crc kubenswrapper[4717]: I0308 05:44:59.632140 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" event={"ID":"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b","Type":"ContainerStarted","Data":"da2541791460a9de1c5fa702005f2b8a26b47017f100f64b6b2c85514db3bfe7"} Mar 08 05:44:59 crc kubenswrapper[4717]: I0308 05:44:59.632258 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" event={"ID":"1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b","Type":"ContainerStarted","Data":"408242b219c3939caac0bab97181dc2915811d13c6ba9a0fa009e4fdc530e580"} Mar 08 05:44:59 crc kubenswrapper[4717]: I0308 05:44:59.632348 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:44:59 crc kubenswrapper[4717]: I0308 05:44:59.634708 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" event={"ID":"f09b3f70-1158-4269-abf3-acf3fecc0cb9","Type":"ContainerStarted","Data":"0c01e14c21034f7ad705bd18ed7c9b05e7ace8dbeeb6a76e5c3351a0b638ce40"} Mar 08 05:44:59 crc kubenswrapper[4717]: I0308 05:44:59.668002 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" podStartSLOduration=33.667974024 podStartE2EDuration="33.667974024s" podCreationTimestamp="2026-03-08 05:44:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:44:59.667139963 +0000 UTC m=+1126.584788797" watchObservedRunningTime="2026-03-08 05:44:59.667974024 +0000 UTC m=+1126.585622868" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.136940 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz"] Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.139243 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.142075 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.142484 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.145148 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz"] Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.270230 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6kb5\" (UniqueName: \"kubernetes.io/projected/d730563b-5163-4a57-bc85-85d59e25a6ed-kube-api-access-c6kb5\") pod \"collect-profiles-29549145-xlqmz\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 
08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.270284 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d730563b-5163-4a57-bc85-85d59e25a6ed-config-volume\") pod \"collect-profiles-29549145-xlqmz\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.270335 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d730563b-5163-4a57-bc85-85d59e25a6ed-secret-volume\") pod \"collect-profiles-29549145-xlqmz\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.371923 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6kb5\" (UniqueName: \"kubernetes.io/projected/d730563b-5163-4a57-bc85-85d59e25a6ed-kube-api-access-c6kb5\") pod \"collect-profiles-29549145-xlqmz\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.371975 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d730563b-5163-4a57-bc85-85d59e25a6ed-config-volume\") pod \"collect-profiles-29549145-xlqmz\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.372047 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d730563b-5163-4a57-bc85-85d59e25a6ed-secret-volume\") pod 
\"collect-profiles-29549145-xlqmz\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.373622 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d730563b-5163-4a57-bc85-85d59e25a6ed-config-volume\") pod \"collect-profiles-29549145-xlqmz\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.378996 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d730563b-5163-4a57-bc85-85d59e25a6ed-secret-volume\") pod \"collect-profiles-29549145-xlqmz\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.389108 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6kb5\" (UniqueName: \"kubernetes.io/projected/d730563b-5163-4a57-bc85-85d59e25a6ed-kube-api-access-c6kb5\") pod \"collect-profiles-29549145-xlqmz\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:00 crc kubenswrapper[4717]: I0308 05:45:00.464276 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:01 crc kubenswrapper[4717]: I0308 05:45:01.034988 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz"] Mar 08 05:45:01 crc kubenswrapper[4717]: I0308 05:45:01.654458 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" event={"ID":"d730563b-5163-4a57-bc85-85d59e25a6ed","Type":"ContainerStarted","Data":"e071d145a852f99a26051262453c8cc0542648cec3eaf06768ad02c7f4c2588f"} Mar 08 05:45:01 crc kubenswrapper[4717]: I0308 05:45:01.656605 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj" event={"ID":"9fe70b75-885a-402b-98e1-f5c696e47f48","Type":"ContainerStarted","Data":"8d070c70ffb622683bbab0f9f3791aff935ce628ce43c75e508c5cbf9d9ca42b"} Mar 08 05:45:01 crc kubenswrapper[4717]: I0308 05:45:01.656914 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj" Mar 08 05:45:01 crc kubenswrapper[4717]: I0308 05:45:01.658354 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" event={"ID":"be14026d-4e86-4134-8f2a-617e9272d2a1","Type":"ContainerStarted","Data":"c78f9f4f9717798036572e1c86865a6e2d650003844dfa1df93e7555b6ea56a8"} Mar 08 05:45:01 crc kubenswrapper[4717]: I0308 05:45:01.658560 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" Mar 08 05:45:01 crc kubenswrapper[4717]: I0308 05:45:01.676000 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj" podStartSLOduration=3.804975054 
podStartE2EDuration="36.675968871s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:27.895458013 +0000 UTC m=+1094.813106857" lastFinishedPulling="2026-03-08 05:45:00.76645182 +0000 UTC m=+1127.684100674" observedRunningTime="2026-03-08 05:45:01.675210052 +0000 UTC m=+1128.592858906" watchObservedRunningTime="2026-03-08 05:45:01.675968871 +0000 UTC m=+1128.593617725" Mar 08 05:45:02 crc kubenswrapper[4717]: I0308 05:45:02.670851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" event={"ID":"f09b3f70-1158-4269-abf3-acf3fecc0cb9","Type":"ContainerStarted","Data":"a1133f0db67494735b368e6a5c1194616601e38e5e369d505b8cf26e9d2d382b"} Mar 08 05:45:02 crc kubenswrapper[4717]: I0308 05:45:02.671387 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:45:02 crc kubenswrapper[4717]: I0308 05:45:02.675074 4717 generic.go:334] "Generic (PLEG): container finished" podID="d730563b-5163-4a57-bc85-85d59e25a6ed" containerID="8bb44e2cd61646c483ba46e5c007bee41625b60eb68211d01d5e4f47c260d0f2" exitCode=0 Mar 08 05:45:02 crc kubenswrapper[4717]: I0308 05:45:02.675187 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" event={"ID":"d730563b-5163-4a57-bc85-85d59e25a6ed","Type":"ContainerDied","Data":"8bb44e2cd61646c483ba46e5c007bee41625b60eb68211d01d5e4f47c260d0f2"} Mar 08 05:45:02 crc kubenswrapper[4717]: I0308 05:45:02.680775 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95" event={"ID":"17485954-f1e6-4042-9338-ad5115801764","Type":"ContainerStarted","Data":"b24156856fade763a796da20bc5fdf6959fd4f12af0be964f82f71f0a7e401d9"} Mar 08 05:45:02 crc kubenswrapper[4717]: I0308 05:45:02.681405 
4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95" Mar 08 05:45:02 crc kubenswrapper[4717]: I0308 05:45:02.711204 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" podStartSLOduration=35.187740648 podStartE2EDuration="37.711156253s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:59.179384608 +0000 UTC m=+1126.097033462" lastFinishedPulling="2026-03-08 05:45:01.702800223 +0000 UTC m=+1128.620449067" observedRunningTime="2026-03-08 05:45:02.709937683 +0000 UTC m=+1129.627586537" watchObservedRunningTime="2026-03-08 05:45:02.711156253 +0000 UTC m=+1129.628805147" Mar 08 05:45:02 crc kubenswrapper[4717]: I0308 05:45:02.717513 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" podStartSLOduration=35.349433281 podStartE2EDuration="37.717483689s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:58.431242443 +0000 UTC m=+1125.348891327" lastFinishedPulling="2026-03-08 05:45:00.799292851 +0000 UTC m=+1127.716941735" observedRunningTime="2026-03-08 05:45:01.707904079 +0000 UTC m=+1128.625552943" watchObservedRunningTime="2026-03-08 05:45:02.717483689 +0000 UTC m=+1129.635132553" Mar 08 05:45:02 crc kubenswrapper[4717]: I0308 05:45:02.766458 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95" podStartSLOduration=4.41156707 podStartE2EDuration="37.766429188s" podCreationTimestamp="2026-03-08 05:44:25 +0000 UTC" firstStartedPulling="2026-03-08 05:44:28.348211812 +0000 UTC m=+1095.265860646" lastFinishedPulling="2026-03-08 05:45:01.70307391 +0000 UTC m=+1128.620722764" observedRunningTime="2026-03-08 
05:45:02.757122708 +0000 UTC m=+1129.674771562" watchObservedRunningTime="2026-03-08 05:45:02.766429188 +0000 UTC m=+1129.684078042" Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.043357 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.144475 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6kb5\" (UniqueName: \"kubernetes.io/projected/d730563b-5163-4a57-bc85-85d59e25a6ed-kube-api-access-c6kb5\") pod \"d730563b-5163-4a57-bc85-85d59e25a6ed\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.145440 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d730563b-5163-4a57-bc85-85d59e25a6ed-config-volume\") pod \"d730563b-5163-4a57-bc85-85d59e25a6ed\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.145760 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d730563b-5163-4a57-bc85-85d59e25a6ed-secret-volume\") pod \"d730563b-5163-4a57-bc85-85d59e25a6ed\" (UID: \"d730563b-5163-4a57-bc85-85d59e25a6ed\") " Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.147382 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d730563b-5163-4a57-bc85-85d59e25a6ed-config-volume" (OuterVolumeSpecName: "config-volume") pod "d730563b-5163-4a57-bc85-85d59e25a6ed" (UID: "d730563b-5163-4a57-bc85-85d59e25a6ed"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.155523 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d730563b-5163-4a57-bc85-85d59e25a6ed-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d730563b-5163-4a57-bc85-85d59e25a6ed" (UID: "d730563b-5163-4a57-bc85-85d59e25a6ed"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.157172 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d730563b-5163-4a57-bc85-85d59e25a6ed-kube-api-access-c6kb5" (OuterVolumeSpecName: "kube-api-access-c6kb5") pod "d730563b-5163-4a57-bc85-85d59e25a6ed" (UID: "d730563b-5163-4a57-bc85-85d59e25a6ed"). InnerVolumeSpecName "kube-api-access-c6kb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.248674 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6kb5\" (UniqueName: \"kubernetes.io/projected/d730563b-5163-4a57-bc85-85d59e25a6ed-kube-api-access-c6kb5\") on node \"crc\" DevicePath \"\"" Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.248743 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d730563b-5163-4a57-bc85-85d59e25a6ed-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.248764 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d730563b-5163-4a57-bc85-85d59e25a6ed-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.712000 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" 
event={"ID":"d730563b-5163-4a57-bc85-85d59e25a6ed","Type":"ContainerDied","Data":"e071d145a852f99a26051262453c8cc0542648cec3eaf06768ad02c7f4c2588f"} Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.712058 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e071d145a852f99a26051262453c8cc0542648cec3eaf06768ad02c7f4c2588f" Mar 08 05:45:04 crc kubenswrapper[4717]: I0308 05:45:04.712114 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz" Mar 08 05:45:06 crc kubenswrapper[4717]: I0308 05:45:06.318273 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-s8gsj" Mar 08 05:45:06 crc kubenswrapper[4717]: I0308 05:45:06.448721 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wmjsb" Mar 08 05:45:06 crc kubenswrapper[4717]: I0308 05:45:06.461056 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-mp5dj" Mar 08 05:45:06 crc kubenswrapper[4717]: I0308 05:45:06.610557 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-htv95" Mar 08 05:45:06 crc kubenswrapper[4717]: I0308 05:45:06.988190 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-djsmm" Mar 08 05:45:08 crc kubenswrapper[4717]: I0308 05:45:08.047213 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vrnlx" Mar 08 05:45:08 crc kubenswrapper[4717]: I0308 05:45:08.648258 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n" Mar 08 05:45:09 crc kubenswrapper[4717]: I0308 05:45:09.008729 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-8wwfb" Mar 08 05:45:09 crc kubenswrapper[4717]: I0308 05:45:09.763054 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl" event={"ID":"7da8d6da-69ae-4351-a774-20888648eac2","Type":"ContainerStarted","Data":"fea783fc27bae7745c8287fed923d2cf1b7051de09e8f1d643b977509bc97397"} Mar 08 05:45:10 crc kubenswrapper[4717]: I0308 05:45:10.806781 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bqgl" podStartSLOduration=9.882484667 podStartE2EDuration="44.806740089s" podCreationTimestamp="2026-03-08 05:44:26 +0000 UTC" firstStartedPulling="2026-03-08 05:44:28.30719531 +0000 UTC m=+1095.224844154" lastFinishedPulling="2026-03-08 05:45:03.231450682 +0000 UTC m=+1130.149099576" observedRunningTime="2026-03-08 05:45:10.798947047 +0000 UTC m=+1137.716595931" watchObservedRunningTime="2026-03-08 05:45:10.806740089 +0000 UTC m=+1137.724388943" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.501119 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-bz488"] Mar 08 05:45:29 crc kubenswrapper[4717]: E0308 05:45:29.502155 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d730563b-5163-4a57-bc85-85d59e25a6ed" containerName="collect-profiles" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.502169 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d730563b-5163-4a57-bc85-85d59e25a6ed" containerName="collect-profiles" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.502298 4717 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d730563b-5163-4a57-bc85-85d59e25a6ed" containerName="collect-profiles" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.503041 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-bz488" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.506311 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.506525 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.506632 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.506653 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-xkw2v" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.518173 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-bz488"] Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.542581 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-ph479"] Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.546652 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.548391 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.567140 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-ph479"] Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.631547 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7957c7d8-a7bf-4381-be99-a48f6ede8f50-config\") pod \"dnsmasq-dns-8468885bfc-bz488\" (UID: \"7957c7d8-a7bf-4381-be99-a48f6ede8f50\") " pod="openstack/dnsmasq-dns-8468885bfc-bz488" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.631980 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-config\") pod \"dnsmasq-dns-545d49fd5c-ph479\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.632034 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrbn\" (UniqueName: \"kubernetes.io/projected/7957c7d8-a7bf-4381-be99-a48f6ede8f50-kube-api-access-2lrbn\") pod \"dnsmasq-dns-8468885bfc-bz488\" (UID: \"7957c7d8-a7bf-4381-be99-a48f6ede8f50\") " pod="openstack/dnsmasq-dns-8468885bfc-bz488" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.632083 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24jjs\" (UniqueName: \"kubernetes.io/projected/48e7ebe7-7af6-4e99-bef7-8daaff76b916-kube-api-access-24jjs\") pod \"dnsmasq-dns-545d49fd5c-ph479\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " 
pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.632141 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-ph479\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.733693 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24jjs\" (UniqueName: \"kubernetes.io/projected/48e7ebe7-7af6-4e99-bef7-8daaff76b916-kube-api-access-24jjs\") pod \"dnsmasq-dns-545d49fd5c-ph479\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.733763 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-ph479\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.733868 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7957c7d8-a7bf-4381-be99-a48f6ede8f50-config\") pod \"dnsmasq-dns-8468885bfc-bz488\" (UID: \"7957c7d8-a7bf-4381-be99-a48f6ede8f50\") " pod="openstack/dnsmasq-dns-8468885bfc-bz488" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.733902 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-config\") pod \"dnsmasq-dns-545d49fd5c-ph479\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:29 crc 
kubenswrapper[4717]: I0308 05:45:29.733950 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrbn\" (UniqueName: \"kubernetes.io/projected/7957c7d8-a7bf-4381-be99-a48f6ede8f50-kube-api-access-2lrbn\") pod \"dnsmasq-dns-8468885bfc-bz488\" (UID: \"7957c7d8-a7bf-4381-be99-a48f6ede8f50\") " pod="openstack/dnsmasq-dns-8468885bfc-bz488" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.735134 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-config\") pod \"dnsmasq-dns-545d49fd5c-ph479\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.735180 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7957c7d8-a7bf-4381-be99-a48f6ede8f50-config\") pod \"dnsmasq-dns-8468885bfc-bz488\" (UID: \"7957c7d8-a7bf-4381-be99-a48f6ede8f50\") " pod="openstack/dnsmasq-dns-8468885bfc-bz488" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.735280 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-ph479\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.765192 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrbn\" (UniqueName: \"kubernetes.io/projected/7957c7d8-a7bf-4381-be99-a48f6ede8f50-kube-api-access-2lrbn\") pod \"dnsmasq-dns-8468885bfc-bz488\" (UID: \"7957c7d8-a7bf-4381-be99-a48f6ede8f50\") " pod="openstack/dnsmasq-dns-8468885bfc-bz488" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.765971 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-24jjs\" (UniqueName: \"kubernetes.io/projected/48e7ebe7-7af6-4e99-bef7-8daaff76b916-kube-api-access-24jjs\") pod \"dnsmasq-dns-545d49fd5c-ph479\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.829439 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-bz488" Mar 08 05:45:29 crc kubenswrapper[4717]: I0308 05:45:29.868801 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:45:30 crc kubenswrapper[4717]: I0308 05:45:30.287266 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-bz488"] Mar 08 05:45:30 crc kubenswrapper[4717]: I0308 05:45:30.350691 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-ph479"] Mar 08 05:45:30 crc kubenswrapper[4717]: W0308 05:45:30.353015 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48e7ebe7_7af6_4e99_bef7_8daaff76b916.slice/crio-57e94399a96729092c441aa6ed8ac1013f1223f358710b0ff99a62991c7b7ce3 WatchSource:0}: Error finding container 57e94399a96729092c441aa6ed8ac1013f1223f358710b0ff99a62991c7b7ce3: Status 404 returned error can't find the container with id 57e94399a96729092c441aa6ed8ac1013f1223f358710b0ff99a62991c7b7ce3 Mar 08 05:45:31 crc kubenswrapper[4717]: I0308 05:45:31.006892 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545d49fd5c-ph479" event={"ID":"48e7ebe7-7af6-4e99-bef7-8daaff76b916","Type":"ContainerStarted","Data":"57e94399a96729092c441aa6ed8ac1013f1223f358710b0ff99a62991c7b7ce3"} Mar 08 05:45:31 crc kubenswrapper[4717]: I0308 05:45:31.010819 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8468885bfc-bz488" 
event={"ID":"7957c7d8-a7bf-4381-be99-a48f6ede8f50","Type":"ContainerStarted","Data":"dac4563fdb7a62c3aa2e371a0c83f099abba407deb1fd359b0070cc06584374b"} Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.321750 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-ph479"] Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.361898 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-zgdqn"] Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.363464 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.369466 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-zgdqn"] Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.503695 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl47w\" (UniqueName: \"kubernetes.io/projected/8822d5b2-773c-48fa-90e8-09f890b9ca6a-kube-api-access-kl47w\") pod \"dnsmasq-dns-b9b4959cc-zgdqn\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.503841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-dns-svc\") pod \"dnsmasq-dns-b9b4959cc-zgdqn\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.503879 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-config\") pod \"dnsmasq-dns-b9b4959cc-zgdqn\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " 
pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.606407 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-config\") pod \"dnsmasq-dns-b9b4959cc-zgdqn\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.606473 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl47w\" (UniqueName: \"kubernetes.io/projected/8822d5b2-773c-48fa-90e8-09f890b9ca6a-kube-api-access-kl47w\") pod \"dnsmasq-dns-b9b4959cc-zgdqn\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.606555 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-dns-svc\") pod \"dnsmasq-dns-b9b4959cc-zgdqn\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.607523 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-dns-svc\") pod \"dnsmasq-dns-b9b4959cc-zgdqn\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.608079 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-config\") pod \"dnsmasq-dns-b9b4959cc-zgdqn\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.627969 4717 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-bz488"] Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.634549 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl47w\" (UniqueName: \"kubernetes.io/projected/8822d5b2-773c-48fa-90e8-09f890b9ca6a-kube-api-access-kl47w\") pod \"dnsmasq-dns-b9b4959cc-zgdqn\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.645308 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-jtq2p"] Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.646398 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.658516 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-jtq2p"] Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.694231 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.707522 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-config\") pod \"dnsmasq-dns-86b8f4ff9-jtq2p\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.707590 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2chks\" (UniqueName: \"kubernetes.io/projected/b424873e-033e-4970-be30-3481fa57c5fc-kube-api-access-2chks\") pod \"dnsmasq-dns-86b8f4ff9-jtq2p\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.707620 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-dns-svc\") pod \"dnsmasq-dns-86b8f4ff9-jtq2p\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.813651 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2chks\" (UniqueName: \"kubernetes.io/projected/b424873e-033e-4970-be30-3481fa57c5fc-kube-api-access-2chks\") pod \"dnsmasq-dns-86b8f4ff9-jtq2p\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.813731 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-dns-svc\") pod \"dnsmasq-dns-86b8f4ff9-jtq2p\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") 
" pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.813801 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-config\") pod \"dnsmasq-dns-86b8f4ff9-jtq2p\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.814664 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-config\") pod \"dnsmasq-dns-86b8f4ff9-jtq2p\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.822741 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-dns-svc\") pod \"dnsmasq-dns-86b8f4ff9-jtq2p\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.863439 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2chks\" (UniqueName: \"kubernetes.io/projected/b424873e-033e-4970-be30-3481fa57c5fc-kube-api-access-2chks\") pod \"dnsmasq-dns-86b8f4ff9-jtq2p\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.932969 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-jtq2p"] Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.933668 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.959348 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f7d487d45-ljhqr"] Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.960549 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:33 crc kubenswrapper[4717]: I0308 05:45:33.986066 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f7d487d45-ljhqr"] Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.118626 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-config\") pod \"dnsmasq-dns-7f7d487d45-ljhqr\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.118780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7j9f\" (UniqueName: \"kubernetes.io/projected/3a759386-18a4-4dd0-9b3c-1dafec1eb845-kube-api-access-f7j9f\") pod \"dnsmasq-dns-7f7d487d45-ljhqr\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.118819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-dns-svc\") pod \"dnsmasq-dns-7f7d487d45-ljhqr\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.119524 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.119565 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.220352 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7j9f\" (UniqueName: \"kubernetes.io/projected/3a759386-18a4-4dd0-9b3c-1dafec1eb845-kube-api-access-f7j9f\") pod \"dnsmasq-dns-7f7d487d45-ljhqr\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.220691 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-dns-svc\") pod \"dnsmasq-dns-7f7d487d45-ljhqr\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.220769 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-config\") pod \"dnsmasq-dns-7f7d487d45-ljhqr\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.221619 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-config\") pod \"dnsmasq-dns-7f7d487d45-ljhqr\" (UID: 
\"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.222320 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-dns-svc\") pod \"dnsmasq-dns-7f7d487d45-ljhqr\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.242171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7j9f\" (UniqueName: \"kubernetes.io/projected/3a759386-18a4-4dd0-9b3c-1dafec1eb845-kube-api-access-f7j9f\") pod \"dnsmasq-dns-7f7d487d45-ljhqr\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.322130 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.378234 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-zgdqn"] Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.516532 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.517857 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.519754 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.523333 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.523385 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.523482 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.523760 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.524057 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-949rb" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.524163 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.524275 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.525404 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-jtq2p"] Mar 08 05:45:34 crc kubenswrapper[4717]: W0308 05:45:34.546470 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb424873e_033e_4970_be30_3481fa57c5fc.slice/crio-ba62510be5a7ba66d0b9541f72e5a7c2dd5de3313489398e77ea49a8561b0136 WatchSource:0}: Error finding container ba62510be5a7ba66d0b9541f72e5a7c2dd5de3313489398e77ea49a8561b0136: Status 404 returned error can't 
find the container with id ba62510be5a7ba66d0b9541f72e5a7c2dd5de3313489398e77ea49a8561b0136 Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.626652 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ce570a4-b883-4b07-a4a2-e5e820ab538c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.626730 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.626757 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.626780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-config-data\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.626834 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 
05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.626900 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.627048 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4km\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-kube-api-access-hd4km\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.627123 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ce570a4-b883-4b07-a4a2-e5e820ab538c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.627288 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.627373 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 
05:45:34.627421 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.728446 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.728830 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.728861 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4km\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-kube-api-access-hd4km\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.728884 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ce570a4-b883-4b07-a4a2-e5e820ab538c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.728922 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.728952 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.728976 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.728998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ce570a4-b883-4b07-a4a2-e5e820ab538c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.729014 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.729028 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " 
pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.729044 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-config-data\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.730527 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.731206 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.731367 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.731689 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.731936 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.733987 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-config-data\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.739684 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ce570a4-b883-4b07-a4a2-e5e820ab538c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.743355 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.756347 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ce570a4-b883-4b07-a4a2-e5e820ab538c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.757498 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " 
pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.786743 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4km\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-kube-api-access-hd4km\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.846775 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.853283 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.861110 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") " pod="openstack/rabbitmq-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.880961 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.881227 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.881342 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.881433 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lqqnh" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.881603 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 
05:45:34.889282 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.889426 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.889983 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.905506 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f7d487d45-ljhqr"] Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.932943 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.932989 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.933012 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.933078 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4a94056-9d2f-45ef-afa3-cf858787fc87-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.933101 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.933118 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.933132 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssdq2\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-kube-api-access-ssdq2\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.933150 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.933208 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.933225 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4a94056-9d2f-45ef-afa3-cf858787fc87-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:34 crc kubenswrapper[4717]: I0308 05:45:34.933241 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4a94056-9d2f-45ef-afa3-cf858787fc87-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034472 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034494 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034509 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssdq2\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-kube-api-access-ssdq2\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034530 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034607 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4a94056-9d2f-45ef-afa3-cf858787fc87-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034683 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034729 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.034749 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.035156 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.035441 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" 
Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.035806 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.036230 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.036527 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.037763 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.041434 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.042305 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4a94056-9d2f-45ef-afa3-cf858787fc87-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.043101 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4a94056-9d2f-45ef-afa3-cf858787fc87-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.052631 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssdq2\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-kube-api-access-ssdq2\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.057944 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.060442 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.069856 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" 
event={"ID":"8822d5b2-773c-48fa-90e8-09f890b9ca6a","Type":"ContainerStarted","Data":"b2311d8db3ad2289f0e07acd5c1a7cc18aa8544327dc92b51aadf3740b76e0de"} Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.074902 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" event={"ID":"3a759386-18a4-4dd0-9b3c-1dafec1eb845","Type":"ContainerStarted","Data":"5e2ed0da642e15fef4d9baa84757e54dd22ea53d061ae9bf7d3e006190c4cb62"} Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.076774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" event={"ID":"b424873e-033e-4970-be30-3481fa57c5fc","Type":"ContainerStarted","Data":"ba62510be5a7ba66d0b9541f72e5a7c2dd5de3313489398e77ea49a8561b0136"} Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.090878 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.094233 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.100950 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-server-conf" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.101120 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-plugins-conf" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.101301 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-server-dockercfg-dzwx2" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.102946 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-config-data" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.103126 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-default-user" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.103465 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-notifications-rabbitmq-svc" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.104575 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-erlang-cookie" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.128651 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.144419 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.227276 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.240830 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.240910 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.240930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlk2c\" (UniqueName: \"kubernetes.io/projected/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-kube-api-access-rlk2c\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.240971 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.240992 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.241020 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.241076 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.241094 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.241137 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.241160 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.241192 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.342953 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.343002 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.343035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.343075 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.343096 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.343123 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.343143 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.343176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.343212 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.343242 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.343264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlk2c\" (UniqueName: \"kubernetes.io/projected/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-kube-api-access-rlk2c\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.344451 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.344820 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.345003 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.345260 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.345936 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.346470 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.352615 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.352678 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.355215 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.358386 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.384532 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.427451 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlk2c\" (UniqueName: \"kubernetes.io/projected/f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec-kube-api-access-rlk2c\") pod \"notifications-rabbitmq-server-0\" (UID: \"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec\") " pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.442482 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.813539 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 05:45:35 crc kubenswrapper[4717]: I0308 05:45:35.873966 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.043551 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.091167 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d4a94056-9d2f-45ef-afa3-cf858787fc87","Type":"ContainerStarted","Data":"d32b6f3314933d1837a2bc28529f090a98ed82bc3e777da9377897b313f4722a"} Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.092618 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ce570a4-b883-4b07-a4a2-e5e820ab538c","Type":"ContainerStarted","Data":"6e1d70b1123f34f8f3389305426f7651bafa6aca941cf020a0f2f6259c7a8cb4"} Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.099248 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec","Type":"ContainerStarted","Data":"790191260c4e13f5fffe63e9344f7d43c182c68006ca5925294ed7506b3ef03d"} Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.230378 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.232556 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.242814 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.243055 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-pcp5f" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.243282 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.243721 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.246953 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.268775 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.371395 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.371447 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/739f45be-d031-4f80-9c39-1683ddff1289-config-data-default\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.371477 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rqspj\" (UniqueName: \"kubernetes.io/projected/739f45be-d031-4f80-9c39-1683ddff1289-kube-api-access-rqspj\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.371497 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/739f45be-d031-4f80-9c39-1683ddff1289-config-data-generated\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.371532 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/739f45be-d031-4f80-9c39-1683ddff1289-operator-scripts\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.371568 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739f45be-d031-4f80-9c39-1683ddff1289-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.371589 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/739f45be-d031-4f80-9c39-1683ddff1289-kolla-config\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.371621 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/739f45be-d031-4f80-9c39-1683ddff1289-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.475394 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739f45be-d031-4f80-9c39-1683ddff1289-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.475726 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/739f45be-d031-4f80-9c39-1683ddff1289-kolla-config\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.475806 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/739f45be-d031-4f80-9c39-1683ddff1289-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.475887 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.475919 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/739f45be-d031-4f80-9c39-1683ddff1289-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.475984 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqspj\" (UniqueName: \"kubernetes.io/projected/739f45be-d031-4f80-9c39-1683ddff1289-kube-api-access-rqspj\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.475999 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/739f45be-d031-4f80-9c39-1683ddff1289-config-data-generated\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.476072 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/739f45be-d031-4f80-9c39-1683ddff1289-operator-scripts\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.478802 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.478844 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/739f45be-d031-4f80-9c39-1683ddff1289-config-data-generated\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc 
kubenswrapper[4717]: I0308 05:45:36.479398 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/739f45be-d031-4f80-9c39-1683ddff1289-kolla-config\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.479549 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/739f45be-d031-4f80-9c39-1683ddff1289-config-data-default\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.480047 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/739f45be-d031-4f80-9c39-1683ddff1289-operator-scripts\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.488510 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/739f45be-d031-4f80-9c39-1683ddff1289-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.489015 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739f45be-d031-4f80-9c39-1683ddff1289-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.526839 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.543562 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqspj\" (UniqueName: \"kubernetes.io/projected/739f45be-d031-4f80-9c39-1683ddff1289-kube-api-access-rqspj\") pod \"openstack-galera-0\" (UID: \"739f45be-d031-4f80-9c39-1683ddff1289\") " pod="openstack/openstack-galera-0" Mar 08 05:45:36 crc kubenswrapper[4717]: I0308 05:45:36.576608 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.118830 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 05:45:37 crc kubenswrapper[4717]: W0308 05:45:37.135211 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod739f45be_d031_4f80_9c39_1683ddff1289.slice/crio-a8fafe54587b6bb0a64e577384b43f3d976a41b15cd1d8098aa5a9fd19a146a4 WatchSource:0}: Error finding container a8fafe54587b6bb0a64e577384b43f3d976a41b15cd1d8098aa5a9fd19a146a4: Status 404 returned error can't find the container with id a8fafe54587b6bb0a64e577384b43f3d976a41b15cd1d8098aa5a9fd19a146a4 Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.632295 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.635467 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.639985 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.640166 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.640884 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8hsrd" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.641132 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.676030 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.821037 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srcc9\" (UniqueName: \"kubernetes.io/projected/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-kube-api-access-srcc9\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.821083 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.821126 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.821180 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.821196 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.821226 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.821244 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.821263 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.905433 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.908047 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.909892 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.910069 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-92rdf" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.910396 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.929769 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.929840 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-config-data\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.929891 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srcc9\" (UniqueName: 
\"kubernetes.io/projected/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-kube-api-access-srcc9\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.929912 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.929929 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.930028 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.930103 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-kolla-config\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.930142 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ft68\" (UniqueName: \"kubernetes.io/projected/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-kube-api-access-4ft68\") pod 
\"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.930165 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.930182 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.930223 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.930246 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.930275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " 
pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.931101 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.931724 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.931934 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.932328 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.933516 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 
05:45:37.934921 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.938166 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.946664 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.967074 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srcc9\" (UniqueName: \"kubernetes.io/projected/9e4e6ff9-db68-44fc-a8d2-de9471a74f19-kube-api-access-srcc9\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:37 crc kubenswrapper[4717]: I0308 05:45:37.972032 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9e4e6ff9-db68-44fc-a8d2-de9471a74f19\") " pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.037418 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 
05:45:38.037471 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-config-data\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.037500 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.037556 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-kolla-config\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.037582 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ft68\" (UniqueName: \"kubernetes.io/projected/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-kube-api-access-4ft68\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.040276 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-kolla-config\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.040510 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-config-data\") pod \"memcached-0\" (UID: 
\"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.047322 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.059524 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.063276 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ft68\" (UniqueName: \"kubernetes.io/projected/ee8a4411-d973-4eeb-b6cd-eb0844e7826e-kube-api-access-4ft68\") pod \"memcached-0\" (UID: \"ee8a4411-d973-4eeb-b6cd-eb0844e7826e\") " pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.131611 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"739f45be-d031-4f80-9c39-1683ddff1289","Type":"ContainerStarted","Data":"a8fafe54587b6bb0a64e577384b43f3d976a41b15cd1d8098aa5a9fd19a146a4"} Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.262367 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.330667 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.779456 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 05:45:38 crc kubenswrapper[4717]: W0308 05:45:38.784557 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e4e6ff9_db68_44fc_a8d2_de9471a74f19.slice/crio-95097b9c869fc81a0300b5da6586232a6072cf2aee13817b5d18950bcb97175e WatchSource:0}: Error finding container 95097b9c869fc81a0300b5da6586232a6072cf2aee13817b5d18950bcb97175e: Status 404 returned error can't find the container with id 95097b9c869fc81a0300b5da6586232a6072cf2aee13817b5d18950bcb97175e Mar 08 05:45:38 crc kubenswrapper[4717]: I0308 05:45:38.889623 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 05:45:39 crc kubenswrapper[4717]: I0308 05:45:39.145271 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9e4e6ff9-db68-44fc-a8d2-de9471a74f19","Type":"ContainerStarted","Data":"95097b9c869fc81a0300b5da6586232a6072cf2aee13817b5d18950bcb97175e"} Mar 08 05:45:39 crc kubenswrapper[4717]: I0308 05:45:39.147334 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ee8a4411-d973-4eeb-b6cd-eb0844e7826e","Type":"ContainerStarted","Data":"4c7de09c60930095ed29182f3f9472c13f22d91340157df2cdc5847b9fc2dd36"} Mar 08 05:45:40 crc kubenswrapper[4717]: I0308 05:45:40.394881 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 05:45:40 crc kubenswrapper[4717]: I0308 05:45:40.396602 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 05:45:40 crc kubenswrapper[4717]: I0308 05:45:40.419505 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-sshxv" Mar 08 05:45:40 crc kubenswrapper[4717]: I0308 05:45:40.420157 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 05:45:40 crc kubenswrapper[4717]: I0308 05:45:40.588746 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpp5t\" (UniqueName: \"kubernetes.io/projected/52a5a330-a048-48ff-b195-fc897299b500-kube-api-access-tpp5t\") pod \"kube-state-metrics-0\" (UID: \"52a5a330-a048-48ff-b195-fc897299b500\") " pod="openstack/kube-state-metrics-0" Mar 08 05:45:40 crc kubenswrapper[4717]: I0308 05:45:40.690445 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpp5t\" (UniqueName: \"kubernetes.io/projected/52a5a330-a048-48ff-b195-fc897299b500-kube-api-access-tpp5t\") pod \"kube-state-metrics-0\" (UID: \"52a5a330-a048-48ff-b195-fc897299b500\") " pod="openstack/kube-state-metrics-0" Mar 08 05:45:40 crc kubenswrapper[4717]: I0308 05:45:40.726055 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpp5t\" (UniqueName: \"kubernetes.io/projected/52a5a330-a048-48ff-b195-fc897299b500-kube-api-access-tpp5t\") pod \"kube-state-metrics-0\" (UID: \"52a5a330-a048-48ff-b195-fc897299b500\") " pod="openstack/kube-state-metrics-0" Mar 08 05:45:40 crc kubenswrapper[4717]: I0308 05:45:40.750677 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.346253 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 05:45:41 crc kubenswrapper[4717]: W0308 05:45:41.367317 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a5a330_a048_48ff_b195_fc897299b500.slice/crio-7d789d5e1db6d4e497c79705b0cd83a1156894935fae821177e1c1ce97cb2870 WatchSource:0}: Error finding container 7d789d5e1db6d4e497c79705b0cd83a1156894935fae821177e1c1ce97cb2870: Status 404 returned error can't find the container with id 7d789d5e1db6d4e497c79705b0cd83a1156894935fae821177e1c1ce97cb2870 Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.720324 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.723179 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.733438 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.733660 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.733828 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.733933 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.734046 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-ckzwm" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.734183 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.736381 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.736395 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.743060 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.908796 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpnnv\" (UniqueName: 
\"kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-kube-api-access-wpnnv\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.908851 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.908881 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.908912 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d81933c3-0769-427a-a494-9cfd438d269d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.908934 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.909094 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.909151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.909263 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-config\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.909474 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:41 crc kubenswrapper[4717]: I0308 05:45:41.909523 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: 
\"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.010734 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpnnv\" (UniqueName: \"kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-kube-api-access-wpnnv\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.010799 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.010831 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.010863 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d81933c3-0769-427a-a494-9cfd438d269d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.010881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.010904 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.010930 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.010965 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-config\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.011026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.011051 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.011882 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.013230 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.013861 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.041227 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.050082 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-config\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.050594 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.056081 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d81933c3-0769-427a-a494-9cfd438d269d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.057469 4717 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.057496 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b9ef073caa6ec1ae4d35eecfe80ee2af5cbcdd85b8b9ead8efa911e24063287d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.058564 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.093010 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpnnv\" (UniqueName: \"kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-kube-api-access-wpnnv\") pod \"prometheus-metric-storage-0\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.265506 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"52a5a330-a048-48ff-b195-fc897299b500","Type":"ContainerStarted","Data":"7d789d5e1db6d4e497c79705b0cd83a1156894935fae821177e1c1ce97cb2870"} Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.378964 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" 
(UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:42 crc kubenswrapper[4717]: I0308 05:45:42.660037 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.619094 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mcfsn"] Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.620190 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.622401 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.622928 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.623055 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-mhpdz" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.644800 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mcfsn"] Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.657754 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4c5fb"] Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.659901 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.686416 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4c5fb"] Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.765888 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-combined-ca-bundle\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.765952 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-var-run-ovn\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.765980 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-var-run\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.766067 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-scripts\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.766170 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8tz\" (UniqueName: 
\"kubernetes.io/projected/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-kube-api-access-px8tz\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.766225 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-var-log\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.766310 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-var-lib\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.766346 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-etc-ovs\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.766439 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-var-log-ovn\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.766496 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-var-run\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.766537 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-ovn-controller-tls-certs\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.766671 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bed90a3-1840-4f1a-a71b-cad45398bd15-scripts\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.766794 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2dtz\" (UniqueName: \"kubernetes.io/projected/0bed90a3-1840-4f1a-a71b-cad45398bd15-kube-api-access-s2dtz\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.868683 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bed90a3-1840-4f1a-a71b-cad45398bd15-scripts\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.868791 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dtz\" (UniqueName: 
\"kubernetes.io/projected/0bed90a3-1840-4f1a-a71b-cad45398bd15-kube-api-access-s2dtz\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.868851 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-combined-ca-bundle\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.868896 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-var-run-ovn\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.868930 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-var-run\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.868957 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-scripts\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.868983 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px8tz\" (UniqueName: \"kubernetes.io/projected/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-kube-api-access-px8tz\") pod \"ovn-controller-mcfsn\" (UID: 
\"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.869008 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-var-log\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.869037 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-var-lib\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.869065 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-etc-ovs\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.869101 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-var-log-ovn\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.869132 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-var-run\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.869175 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-ovn-controller-tls-certs\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.869706 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-var-log\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.869939 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-var-run\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.871580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-etc-ovs\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.871717 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-var-log-ovn\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.871765 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-var-run\") pod 
\"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.871808 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.871844 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0bed90a3-1840-4f1a-a71b-cad45398bd15-var-lib\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.872089 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-var-run-ovn\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.873364 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bed90a3-1840-4f1a-a71b-cad45398bd15-scripts\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.881189 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-scripts\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.893728 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-ovn-controller-tls-certs\") pod \"ovn-controller-mcfsn\" (UID: 
\"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.895968 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-combined-ca-bundle\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.904476 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.906743 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.915055 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8tz\" (UniqueName: \"kubernetes.io/projected/52c7f6df-8563-4181-bc1e-6fb4c3dd2126-kube-api-access-px8tz\") pod \"ovn-controller-mcfsn\" (UID: \"52c7f6df-8563-4181-bc1e-6fb4c3dd2126\") " pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.916897 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.918189 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.922748 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dtz\" (UniqueName: \"kubernetes.io/projected/0bed90a3-1840-4f1a-a71b-cad45398bd15-kube-api-access-s2dtz\") pod \"ovn-controller-ovs-4c5fb\" (UID: \"0bed90a3-1840-4f1a-a71b-cad45398bd15\") " pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.927637 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"ovndbcluster-nb-scripts" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.927806 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.928087 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8kpr5" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.966388 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mcfsn" Mar 08 05:45:43 crc kubenswrapper[4717]: I0308 05:45:43.982296 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.073169 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353c8ab9-3710-4290-b5c4-b93339baf4da-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.073250 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/353c8ab9-3710-4290-b5c4-b93339baf4da-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.073295 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/353c8ab9-3710-4290-b5c4-b93339baf4da-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.073326 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4v9v\" (UniqueName: \"kubernetes.io/projected/353c8ab9-3710-4290-b5c4-b93339baf4da-kube-api-access-b4v9v\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.073341 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/353c8ab9-3710-4290-b5c4-b93339baf4da-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.073360 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c8ab9-3710-4290-b5c4-b93339baf4da-config\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.073429 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.073492 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353c8ab9-3710-4290-b5c4-b93339baf4da-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.174456 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4v9v\" 
(UniqueName: \"kubernetes.io/projected/353c8ab9-3710-4290-b5c4-b93339baf4da-kube-api-access-b4v9v\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.174829 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/353c8ab9-3710-4290-b5c4-b93339baf4da-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.174854 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c8ab9-3710-4290-b5c4-b93339baf4da-config\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.174885 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.174913 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353c8ab9-3710-4290-b5c4-b93339baf4da-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.174969 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353c8ab9-3710-4290-b5c4-b93339baf4da-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" 
Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.175001 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/353c8ab9-3710-4290-b5c4-b93339baf4da-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.175035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/353c8ab9-3710-4290-b5c4-b93339baf4da-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.175323 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.177130 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c8ab9-3710-4290-b5c4-b93339baf4da-config\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.177569 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353c8ab9-3710-4290-b5c4-b93339baf4da-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.178846 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/353c8ab9-3710-4290-b5c4-b93339baf4da-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.190891 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/353c8ab9-3710-4290-b5c4-b93339baf4da-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.192598 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/353c8ab9-3710-4290-b5c4-b93339baf4da-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.194336 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353c8ab9-3710-4290-b5c4-b93339baf4da-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.196034 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4v9v\" (UniqueName: \"kubernetes.io/projected/353c8ab9-3710-4290-b5c4-b93339baf4da-kube-api-access-b4v9v\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.231717 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"353c8ab9-3710-4290-b5c4-b93339baf4da\") " pod="openstack/ovsdbserver-nb-0" 
Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.333660 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 05:45:44 crc kubenswrapper[4717]: I0308 05:45:44.646787 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 05:45:44 crc kubenswrapper[4717]: W0308 05:45:44.981890 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd81933c3_0769_427a_a494_9cfd438d269d.slice/crio-ff886074339a045af341a73338f577c36aa46ce7b3fc23dbd9550cd5f8aed868 WatchSource:0}: Error finding container ff886074339a045af341a73338f577c36aa46ce7b3fc23dbd9550cd5f8aed868: Status 404 returned error can't find the container with id ff886074339a045af341a73338f577c36aa46ce7b3fc23dbd9550cd5f8aed868 Mar 08 05:45:45 crc kubenswrapper[4717]: I0308 05:45:45.303424 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d81933c3-0769-427a-a494-9cfd438d269d","Type":"ContainerStarted","Data":"ff886074339a045af341a73338f577c36aa46ce7b3fc23dbd9550cd5f8aed868"} Mar 08 05:45:45 crc kubenswrapper[4717]: I0308 05:45:45.445316 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mcfsn"] Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.263541 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 05:45:46 crc kubenswrapper[4717]: W0308 05:45:46.327666 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod353c8ab9_3710_4290_b5c4_b93339baf4da.slice/crio-10fd26e84cd51bfc62b5fe9cd921d9bdea5de79ea897611ee92ace8ceac2986a WatchSource:0}: Error finding container 10fd26e84cd51bfc62b5fe9cd921d9bdea5de79ea897611ee92ace8ceac2986a: Status 404 returned error can't find the container with id 
10fd26e84cd51bfc62b5fe9cd921d9bdea5de79ea897611ee92ace8ceac2986a Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.335846 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"52a5a330-a048-48ff-b195-fc897299b500","Type":"ContainerStarted","Data":"8881b8b300a170a14020d36cc4145be080a6558a327f34b0f8333b9bdad7fafd"} Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.337451 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.343548 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mcfsn" event={"ID":"52c7f6df-8563-4181-bc1e-6fb4c3dd2126","Type":"ContainerStarted","Data":"fe21415e9e826579f4525adc8df45b9caf37269586dc7ba636d9c43fdd38dede"} Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.365577 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.864096912 podStartE2EDuration="6.365562303s" podCreationTimestamp="2026-03-08 05:45:40 +0000 UTC" firstStartedPulling="2026-03-08 05:45:41.388346434 +0000 UTC m=+1168.305995278" lastFinishedPulling="2026-03-08 05:45:45.889811825 +0000 UTC m=+1172.807460669" observedRunningTime="2026-03-08 05:45:46.362131998 +0000 UTC m=+1173.279780832" watchObservedRunningTime="2026-03-08 05:45:46.365562303 +0000 UTC m=+1173.283211147" Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.390499 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4c5fb"] Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.831901 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-jfvhf"] Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.833149 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.836015 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.844597 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jfvhf"] Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.942546 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/047dda74-4541-43e2-bc0f-ebdd951d1dbf-ovs-rundir\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.942614 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/047dda74-4541-43e2-bc0f-ebdd951d1dbf-ovn-rundir\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.942647 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047dda74-4541-43e2-bc0f-ebdd951d1dbf-combined-ca-bundle\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.942674 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047dda74-4541-43e2-bc0f-ebdd951d1dbf-config\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " 
pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.942730 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047dda74-4541-43e2-bc0f-ebdd951d1dbf-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:46 crc kubenswrapper[4717]: I0308 05:45:46.942882 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdbln\" (UniqueName: \"kubernetes.io/projected/047dda74-4541-43e2-bc0f-ebdd951d1dbf-kube-api-access-fdbln\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.044636 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/047dda74-4541-43e2-bc0f-ebdd951d1dbf-ovs-rundir\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.044760 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/047dda74-4541-43e2-bc0f-ebdd951d1dbf-ovn-rundir\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.044808 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047dda74-4541-43e2-bc0f-ebdd951d1dbf-combined-ca-bundle\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") 
" pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.044852 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047dda74-4541-43e2-bc0f-ebdd951d1dbf-config\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.044896 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047dda74-4541-43e2-bc0f-ebdd951d1dbf-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.044937 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdbln\" (UniqueName: \"kubernetes.io/projected/047dda74-4541-43e2-bc0f-ebdd951d1dbf-kube-api-access-fdbln\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.045024 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/047dda74-4541-43e2-bc0f-ebdd951d1dbf-ovn-rundir\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.045024 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/047dda74-4541-43e2-bc0f-ebdd951d1dbf-ovs-rundir\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc 
kubenswrapper[4717]: I0308 05:45:47.045789 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047dda74-4541-43e2-bc0f-ebdd951d1dbf-config\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.052751 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047dda74-4541-43e2-bc0f-ebdd951d1dbf-combined-ca-bundle\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.070609 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/047dda74-4541-43e2-bc0f-ebdd951d1dbf-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.070635 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdbln\" (UniqueName: \"kubernetes.io/projected/047dda74-4541-43e2-bc0f-ebdd951d1dbf-kube-api-access-fdbln\") pod \"ovn-controller-metrics-jfvhf\" (UID: \"047dda74-4541-43e2-bc0f-ebdd951d1dbf\") " pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.161882 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-jfvhf" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.253303 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f7d487d45-ljhqr"] Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.287815 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bbfd875c9-ssk72"] Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.289178 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.293201 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.298306 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bbfd875c9-ssk72"] Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.367055 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4c5fb" event={"ID":"0bed90a3-1840-4f1a-a71b-cad45398bd15","Type":"ContainerStarted","Data":"493a78fdf6ce3457ea41b1562c9809ab2e541b450225707339ab8a2edecdfcd7"} Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.369624 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"353c8ab9-3710-4290-b5c4-b93339baf4da","Type":"ContainerStarted","Data":"10fd26e84cd51bfc62b5fe9cd921d9bdea5de79ea897611ee92ace8ceac2986a"} Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.463852 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-dns-svc\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.464205 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-ovsdbserver-nb\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.464256 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srkd2\" (UniqueName: \"kubernetes.io/projected/66f71441-f7d0-490c-817b-c0971f4c1fa2-kube-api-access-srkd2\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.464336 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-config\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.565797 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-ovsdbserver-nb\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.565866 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srkd2\" (UniqueName: \"kubernetes.io/projected/66f71441-f7d0-490c-817b-c0971f4c1fa2-kube-api-access-srkd2\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 
05:45:47.565936 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-config\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.565952 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-dns-svc\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.566810 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-dns-svc\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.567441 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-ovsdbserver-nb\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.568561 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-config\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.594104 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srkd2\" (UniqueName: 
\"kubernetes.io/projected/66f71441-f7d0-490c-817b-c0971f4c1fa2-kube-api-access-srkd2\") pod \"dnsmasq-dns-5bbfd875c9-ssk72\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") " pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.619178 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.964449 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.965636 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.969257 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.970004 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.970287 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.970399 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-64g29" Mar 08 05:45:47 crc kubenswrapper[4717]: I0308 05:45:47.976769 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.073749 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: 
I0308 05:45:48.073793 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.073833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.073862 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.073877 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-config\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.073917 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.074155 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.074261 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gd7r\" (UniqueName: \"kubernetes.io/projected/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-kube-api-access-7gd7r\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.175943 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.175981 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.176014 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.176045 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.176060 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-config\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.176097 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.176138 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.176164 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gd7r\" (UniqueName: \"kubernetes.io/projected/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-kube-api-access-7gd7r\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.176416 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 
05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.177645 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-config\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.177791 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.179854 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.181331 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.187430 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.194357 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.195136 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gd7r\" (UniqueName: \"kubernetes.io/projected/0427b7fd-2766-4b7f-bb23-96df1b2f4f5c-kube-api-access-7gd7r\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.209580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c\") " pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:48 crc kubenswrapper[4717]: I0308 05:45:48.309193 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 05:45:49 crc kubenswrapper[4717]: I0308 05:45:49.003212 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jfvhf"] Mar 08 05:45:49 crc kubenswrapper[4717]: I0308 05:45:49.059492 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 05:45:49 crc kubenswrapper[4717]: W0308 05:45:49.066144 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0427b7fd_2766_4b7f_bb23_96df1b2f4f5c.slice/crio-8460b093400ffba5e18700ba12a0bb6c0147d4d8c619931fb80fc3f7c249be7c WatchSource:0}: Error finding container 8460b093400ffba5e18700ba12a0bb6c0147d4d8c619931fb80fc3f7c249be7c: Status 404 returned error can't find the container with id 8460b093400ffba5e18700ba12a0bb6c0147d4d8c619931fb80fc3f7c249be7c Mar 08 05:45:49 crc kubenswrapper[4717]: I0308 05:45:49.137752 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bbfd875c9-ssk72"] Mar 08 05:45:49 crc kubenswrapper[4717]: I0308 05:45:49.388264 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c","Type":"ContainerStarted","Data":"8460b093400ffba5e18700ba12a0bb6c0147d4d8c619931fb80fc3f7c249be7c"} Mar 08 05:45:49 crc kubenswrapper[4717]: I0308 05:45:49.390948 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" event={"ID":"66f71441-f7d0-490c-817b-c0971f4c1fa2","Type":"ContainerStarted","Data":"3533074e2eb2467acf832901e292d04f2e35eafb7315eaad1144e4b3f70f7e00"} Mar 08 05:45:49 crc kubenswrapper[4717]: I0308 05:45:49.392307 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jfvhf" 
event={"ID":"047dda74-4541-43e2-bc0f-ebdd951d1dbf","Type":"ContainerStarted","Data":"f7bc19c756161863eedcf73e920c76eeaf8af917d8d4352c2e492c2add64536b"} Mar 08 05:45:52 crc kubenswrapper[4717]: I0308 05:45:52.425641 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d81933c3-0769-427a-a494-9cfd438d269d","Type":"ContainerStarted","Data":"0d1fa0d6d0a849b60d021551aa1fab1224a09ede8f8e625b4b99600fa195f44a"} Mar 08 05:46:00 crc kubenswrapper[4717]: I0308 05:46:00.130945 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549146-t9dcp"] Mar 08 05:46:00 crc kubenswrapper[4717]: I0308 05:46:00.133101 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549146-t9dcp" Mar 08 05:46:00 crc kubenswrapper[4717]: I0308 05:46:00.137697 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:46:00 crc kubenswrapper[4717]: I0308 05:46:00.137878 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:46:00 crc kubenswrapper[4717]: I0308 05:46:00.138198 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:46:00 crc kubenswrapper[4717]: I0308 05:46:00.155214 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549146-t9dcp"] Mar 08 05:46:00 crc kubenswrapper[4717]: I0308 05:46:00.276151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d677j\" (UniqueName: \"kubernetes.io/projected/c20fb9c8-328d-494c-b578-4abc028448bd-kube-api-access-d677j\") pod \"auto-csr-approver-29549146-t9dcp\" (UID: \"c20fb9c8-328d-494c-b578-4abc028448bd\") " pod="openshift-infra/auto-csr-approver-29549146-t9dcp" Mar 08 05:46:00 crc 
kubenswrapper[4717]: I0308 05:46:00.377647 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d677j\" (UniqueName: \"kubernetes.io/projected/c20fb9c8-328d-494c-b578-4abc028448bd-kube-api-access-d677j\") pod \"auto-csr-approver-29549146-t9dcp\" (UID: \"c20fb9c8-328d-494c-b578-4abc028448bd\") " pod="openshift-infra/auto-csr-approver-29549146-t9dcp" Mar 08 05:46:00 crc kubenswrapper[4717]: I0308 05:46:00.396328 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d677j\" (UniqueName: \"kubernetes.io/projected/c20fb9c8-328d-494c-b578-4abc028448bd-kube-api-access-d677j\") pod \"auto-csr-approver-29549146-t9dcp\" (UID: \"c20fb9c8-328d-494c-b578-4abc028448bd\") " pod="openshift-infra/auto-csr-approver-29549146-t9dcp" Mar 08 05:46:00 crc kubenswrapper[4717]: I0308 05:46:00.483825 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549146-t9dcp" Mar 08 05:46:00 crc kubenswrapper[4717]: I0308 05:46:00.759003 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 08 05:46:01 crc kubenswrapper[4717]: I0308 05:46:01.542182 4717 generic.go:334] "Generic (PLEG): container finished" podID="d81933c3-0769-427a-a494-9cfd438d269d" containerID="0d1fa0d6d0a849b60d021551aa1fab1224a09ede8f8e625b4b99600fa195f44a" exitCode=0 Mar 08 05:46:01 crc kubenswrapper[4717]: I0308 05:46:01.542226 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d81933c3-0769-427a-a494-9cfd438d269d","Type":"ContainerDied","Data":"0d1fa0d6d0a849b60d021551aa1fab1224a09ede8f8e625b4b99600fa195f44a"} Mar 08 05:46:04 crc kubenswrapper[4717]: I0308 05:46:04.119615 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:46:04 crc kubenswrapper[4717]: I0308 05:46:04.119930 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:46:08 crc kubenswrapper[4717]: E0308 05:46:08.185591 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c" Mar 08 05:46:08 crc kubenswrapper[4717]: E0308 05:46:08.186086 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:n99h5fbh698h9h64ch65h5cdh54dh64bh549h586h65fh567h78h5dch649hd6h686hdh57fh66h647h55ch5cchbdh5f8h59fh68ch5b8h5b4h5f9hdq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovs-rundir,ReadOnly:true,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-rundir,ReadOnly:true,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:n
il,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdbln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-metrics-jfvhf_openstack(047dda74-4541-43e2-bc0f-ebdd951d1dbf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:08 crc kubenswrapper[4717]: E0308 05:46:08.187266 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-metrics-jfvhf" podUID="047dda74-4541-43e2-bc0f-ebdd951d1dbf" Mar 08 05:46:08 crc kubenswrapper[4717]: E0308 05:46:08.607783 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c\\\"\"" pod="openstack/ovn-controller-metrics-jfvhf" podUID="047dda74-4541-43e2-bc0f-ebdd951d1dbf" Mar 08 05:46:14 crc kubenswrapper[4717]: E0308 05:46:14.975878 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-memcached:current" Mar 08 05:46:14 crc kubenswrapper[4717]: E0308 05:46:14.976503 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-memcached:current" Mar 08 05:46:14 crc kubenswrapper[4717]: E0308 05:46:14.976751 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.rdoproject.org/podified-master-centos10/openstack-memcached:current,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n85hb6hd9h56h648h679h685h76hb6h668h7h67chbdh7h654hfh59dhfch57fh54ch665h5d5h65h8bh586h587h7bh649h68h5d6h678h649q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ft68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(ee8a4411-d973-4eeb-b6cd-eb0844e7826e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:14 crc kubenswrapper[4717]: E0308 05:46:14.978035 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="ee8a4411-d973-4eeb-b6cd-eb0844e7826e" Mar 08 05:46:15 crc kubenswrapper[4717]: E0308 05:46:15.665812 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-memcached:current\\\"\"" pod="openstack/memcached-0" podUID="ee8a4411-d973-4eeb-b6cd-eb0844e7826e" Mar 08 05:46:15 crc kubenswrapper[4717]: E0308 05:46:15.969492 4717 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Mar 08 05:46:15 crc kubenswrapper[4717]: E0308 05:46:15.969916 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Mar 08 05:46:15 crc kubenswrapper[4717]: E0308 05:46:15.971663 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ssdq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(d4a94056-9d2f-45ef-afa3-cf858787fc87): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:15 crc 
kubenswrapper[4717]: E0308 05:46:15.973246 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="d4a94056-9d2f-45ef-afa3-cf858787fc87" Mar 08 05:46:15 crc kubenswrapper[4717]: E0308 05:46:15.996806 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current" Mar 08 05:46:15 crc kubenswrapper[4717]: E0308 05:46:15.996864 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current" Mar 08 05:46:15 crc kubenswrapper[4717]: E0308 05:46:15.996994 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqspj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(739f45be-d031-4f80-9c39-1683ddff1289): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.000626 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="739f45be-d031-4f80-9c39-1683ddff1289" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.004279 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.004320 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.004449 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hd4km,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(7ce570a4-b883-4b07-a4a2-e5e820ab538c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:16 crc 
kubenswrapper[4717]: E0308 05:46:16.005795 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.020498 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.020540 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.020652 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rlk2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod notifications-rabbitmq-server-0_openstack(f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:16 crc 
kubenswrapper[4717]: E0308 05:46:16.021806 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/notifications-rabbitmq-server-0" podUID="f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.671322 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="d4a94056-9d2f-45ef-afa3-cf858787fc87" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.671507 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current\\\"\"" pod="openstack/openstack-galera-0" podUID="739f45be-d031-4f80-9c39-1683ddff1289" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.671551 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current\\\"\"" pod="openstack/rabbitmq-server-0" podUID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" Mar 08 05:46:16 crc kubenswrapper[4717]: E0308 05:46:16.672077 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current\\\"\"" pod="openstack/notifications-rabbitmq-server-0" podUID="f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec" Mar 08 05:46:18 crc kubenswrapper[4717]: E0308 05:46:18.456568 4717 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current" Mar 08 05:46:18 crc kubenswrapper[4717]: E0308 05:46:18.456931 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current" Mar 08 05:46:18 crc kubenswrapper[4717]: E0308 05:46:18.457098 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n75h598h5cdh59h5b7h579h685h5cdh547h54ch646h55ch7h65bh65dh566h554h574h68h5cdh5ch5d4h6dh5f8h88h88hc9h68bh585h57ch684hb9q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathE
xpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4v9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(353c8ab9-3710-4290-b5c4-b93339baf4da): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:18 crc kubenswrapper[4717]: I0308 05:46:18.459560 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 05:46:18 crc kubenswrapper[4717]: E0308 05:46:18.720126 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current" Mar 08 05:46:18 crc kubenswrapper[4717]: E0308 05:46:18.720204 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current" Mar 08 05:46:18 crc kubenswrapper[4717]: E0308 05:46:18.720426 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n54ch5c4hfbh7fh674hf4h676h5c8h56dh7ch5f9hcchddh64bh5b8h57h645h85h55h5c8h648h5fhb6h8ch58h647h54bh7h56dh6dh98h65dq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-px8tz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,}
,InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-mcfsn_openstack(52c7f6df-8563-4181-bc1e-6fb4c3dd2126): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:18 crc kubenswrapper[4717]: E0308 05:46:18.721844 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-mcfsn" podUID="52c7f6df-8563-4181-bc1e-6fb4c3dd2126" Mar 08 05:46:19 crc kubenswrapper[4717]: E0308 05:46:19.723608 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current\\\"\"" pod="openstack/ovn-controller-mcfsn" podUID="52c7f6df-8563-4181-bc1e-6fb4c3dd2126" Mar 08 05:46:23 crc kubenswrapper[4717]: E0308 05:46:23.537959 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-sb-db-server:current" Mar 08 05:46:23 crc kubenswrapper[4717]: E0308 05:46:23.538126 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-sb-db-server:current" Mar 08 05:46:23 crc kubenswrapper[4717]: E0308 05:46:23.538451 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-sb,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-sb-db-server:current,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n86h7dh569h664h5b6h674hbdh5fbh59hc8h55dh688h9h56h644hd7h55ch675h657h654h5dfh68h77h86h5d9h57h55chb5h5bfh8h5fh54dq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/p
em/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7gd7r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(0427b7fd-2766-4b7f-bb23-96df1b2f4f5c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:23 crc kubenswrapper[4717]: I0308 05:46:23.976024 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549146-t9dcp"] Mar 08 05:46:24 crc kubenswrapper[4717]: W0308 05:46:24.408897 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc20fb9c8_328d_494c_b578_4abc028448bd.slice/crio-9f0f52b004d51c1639c1379ab5f10cfa4c0c3274c2d4c4536ce750daf340d7dc WatchSource:0}: Error finding container 9f0f52b004d51c1639c1379ab5f10cfa4c0c3274c2d4c4536ce750daf340d7dc: Status 404 
returned error can't find the container with id 9f0f52b004d51c1639c1379ab5f10cfa4c0c3274c2d4c4536ce750daf340d7dc Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.430313 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.430376 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.430574 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n86h5d4h694h586h85h588h66bh5c8hfh65fh64fh4hb6h555h679h99h66bhddh696h698h65bhcfh64fhd8h88h5bdh686h66chbdh68bh588h66cq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srkd2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5bbfd875c9-ssk72_openstack(66f71441-f7d0-490c-817b-c0971f4c1fa2): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.431840 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" podUID="66f71441-f7d0-490c-817b-c0971f4c1fa2" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.498646 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.499082 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.499213 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2chks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86b8f4ff9-jtq2p_openstack(b424873e-033e-4970-be30-3481fa57c5fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.502437 4717 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" podUID="b424873e-033e-4970-be30-3481fa57c5fc" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.504061 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.504089 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.504169 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kl47w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-b9b4959cc-zgdqn_openstack(8822d5b2-773c-48fa-90e8-09f890b9ca6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.509756 4717 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" podUID="8822d5b2-773c-48fa-90e8-09f890b9ca6a" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.518077 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.518132 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.518245 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n77hb9hddhdfhf5h5cch698h578h5f8h675h5c5hdch97h5bch59bh5b6h55h5bch556hb5h599h8dhc8h667h59ch659h578hcfh5c7h9dh645h554q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f7j9f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7f7d487d45-ljhqr_openstack(3a759386-18a4-4dd0-9b3c-1dafec1eb845): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.519532 4717 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" podUID="3a759386-18a4-4dd0-9b3c-1dafec1eb845" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.526030 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.526079 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.526215 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-24jjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-545d49fd5c-ph479_openstack(48e7ebe7-7af6-4e99-bef7-8daaff76b916): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.527544 4717 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-545d49fd5c-ph479" podUID="48e7ebe7-7af6-4e99-bef7-8daaff76b916" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.541465 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.541516 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.541631 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2lrbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8468885bfc-bz488_openstack(7957c7d8-a7bf-4381-be99-a48f6ede8f50): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.542752 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-8468885bfc-bz488" podUID="7957c7d8-a7bf-4381-be99-a48f6ede8f50" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.694492 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="353c8ab9-3710-4290-b5c4-b93339baf4da" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.742058 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="0427b7fd-2766-4b7f-bb23-96df1b2f4f5c" Mar 08 05:46:24 crc kubenswrapper[4717]: I0308 05:46:24.760891 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"353c8ab9-3710-4290-b5c4-b93339baf4da","Type":"ContainerStarted","Data":"55020683da85aa944338990802bba2b16ccfce23b348e353b3e4e04f2d74af07"} Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.762953 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="353c8ab9-3710-4290-b5c4-b93339baf4da" Mar 08 05:46:24 crc kubenswrapper[4717]: I0308 05:46:24.763936 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d81933c3-0769-427a-a494-9cfd438d269d","Type":"ContainerStarted","Data":"86d872f2255e9601628014092ab8d5a682407736b62dfc2349632a05be42f8be"} Mar 08 05:46:24 crc kubenswrapper[4717]: I0308 05:46:24.765592 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549146-t9dcp" 
event={"ID":"c20fb9c8-328d-494c-b578-4abc028448bd","Type":"ContainerStarted","Data":"9f0f52b004d51c1639c1379ab5f10cfa4c0c3274c2d4c4536ce750daf340d7dc"} Mar 08 05:46:24 crc kubenswrapper[4717]: I0308 05:46:24.769991 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c","Type":"ContainerStarted","Data":"aeec6b0c2672c5fdd51c1ec5238d430683b5e8c838dfe31bf51d0be63ae4dc96"} Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.771508 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-sb-db-server:current\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="0427b7fd-2766-4b7f-bb23-96df1b2f4f5c" Mar 08 05:46:24 crc kubenswrapper[4717]: I0308 05:46:24.774512 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jfvhf" event={"ID":"047dda74-4541-43e2-bc0f-ebdd951d1dbf","Type":"ContainerStarted","Data":"b551423295909f8469aad99b83ca2bb42f5999ac6912043459f19a556860aa93"} Mar 08 05:46:24 crc kubenswrapper[4717]: I0308 05:46:24.777531 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9e4e6ff9-db68-44fc-a8d2-de9471a74f19","Type":"ContainerStarted","Data":"5291a481026bd5a521a3f28c93d3b2deb3687b2c5a56696bbadc4093a076b187"} Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.783423 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current\\\"\"" pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" podUID="66f71441-f7d0-490c-817b-c0971f4c1fa2" Mar 08 05:46:24 crc kubenswrapper[4717]: E0308 05:46:24.783935 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current\\\"\"" pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" podUID="8822d5b2-773c-48fa-90e8-09f890b9ca6a" Mar 08 05:46:24 crc kubenswrapper[4717]: I0308 05:46:24.832451 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-jfvhf" podStartSLOduration=3.405157794 podStartE2EDuration="38.832428339s" podCreationTimestamp="2026-03-08 05:45:46 +0000 UTC" firstStartedPulling="2026-03-08 05:45:49.021434068 +0000 UTC m=+1175.939082912" lastFinishedPulling="2026-03-08 05:46:24.448704603 +0000 UTC m=+1211.366353457" observedRunningTime="2026-03-08 05:46:24.828213514 +0000 UTC m=+1211.745862358" watchObservedRunningTime="2026-03-08 05:46:24.832428339 +0000 UTC m=+1211.750077183" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.181068 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.257153 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-dns-svc\") pod \"b424873e-033e-4970-be30-3481fa57c5fc\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.257208 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-config\") pod \"b424873e-033e-4970-be30-3481fa57c5fc\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.257228 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2chks\" (UniqueName: \"kubernetes.io/projected/b424873e-033e-4970-be30-3481fa57c5fc-kube-api-access-2chks\") pod \"b424873e-033e-4970-be30-3481fa57c5fc\" (UID: \"b424873e-033e-4970-be30-3481fa57c5fc\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.258575 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b424873e-033e-4970-be30-3481fa57c5fc" (UID: "b424873e-033e-4970-be30-3481fa57c5fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.258797 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-config" (OuterVolumeSpecName: "config") pod "b424873e-033e-4970-be30-3481fa57c5fc" (UID: "b424873e-033e-4970-be30-3481fa57c5fc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.265860 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b424873e-033e-4970-be30-3481fa57c5fc-kube-api-access-2chks" (OuterVolumeSpecName: "kube-api-access-2chks") pod "b424873e-033e-4970-be30-3481fa57c5fc" (UID: "b424873e-033e-4970-be30-3481fa57c5fc"). InnerVolumeSpecName "kube-api-access-2chks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.327229 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-zgdqn"] Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.361203 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.361236 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b424873e-033e-4970-be30-3481fa57c5fc-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.361248 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2chks\" (UniqueName: \"kubernetes.io/projected/b424873e-033e-4970-be30-3481fa57c5fc-kube-api-access-2chks\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.366126 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc"] Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.367372 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.369613 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.439578 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc"] Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.463035 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-sb\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.463093 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqmsb\" (UniqueName: \"kubernetes.io/projected/22841d35-043a-442d-beca-decc3c750d66-kube-api-access-fqmsb\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.463121 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-nb\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.463151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-dns-svc\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " 
pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.463241 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-config\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.565412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-sb\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.565480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqmsb\" (UniqueName: \"kubernetes.io/projected/22841d35-043a-442d-beca-decc3c750d66-kube-api-access-fqmsb\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.565513 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-nb\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.565537 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-dns-svc\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 
05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.566767 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-dns-svc\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.570942 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-nb\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.571913 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-sb\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.572065 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-config\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.572725 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-config\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.597538 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fqmsb\" (UniqueName: \"kubernetes.io/projected/22841d35-043a-442d-beca-decc3c750d66-kube-api-access-fqmsb\") pod \"dnsmasq-dns-7dc8b7f6fc-2jcgc\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.698649 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-bz488" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.710533 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.711534 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.765787 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774146 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lrbn\" (UniqueName: \"kubernetes.io/projected/7957c7d8-a7bf-4381-be99-a48f6ede8f50-kube-api-access-2lrbn\") pod \"7957c7d8-a7bf-4381-be99-a48f6ede8f50\" (UID: \"7957c7d8-a7bf-4381-be99-a48f6ede8f50\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774258 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-config\") pod \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774281 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7j9f\" (UniqueName: \"kubernetes.io/projected/3a759386-18a4-4dd0-9b3c-1dafec1eb845-kube-api-access-f7j9f\") pod 
\"3a759386-18a4-4dd0-9b3c-1dafec1eb845\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774337 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-dns-svc\") pod \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774366 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-config\") pod \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774393 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7957c7d8-a7bf-4381-be99-a48f6ede8f50-config\") pod \"7957c7d8-a7bf-4381-be99-a48f6ede8f50\" (UID: \"7957c7d8-a7bf-4381-be99-a48f6ede8f50\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774511 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24jjs\" (UniqueName: \"kubernetes.io/projected/48e7ebe7-7af6-4e99-bef7-8daaff76b916-kube-api-access-24jjs\") pod \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\" (UID: \"48e7ebe7-7af6-4e99-bef7-8daaff76b916\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774544 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-dns-svc\") pod \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\" (UID: \"3a759386-18a4-4dd0-9b3c-1dafec1eb845\") " Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774629 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-config" (OuterVolumeSpecName: "config") pod "48e7ebe7-7af6-4e99-bef7-8daaff76b916" (UID: "48e7ebe7-7af6-4e99-bef7-8daaff76b916"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774871 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.774909 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-config" (OuterVolumeSpecName: "config") pod "3a759386-18a4-4dd0-9b3c-1dafec1eb845" (UID: "3a759386-18a4-4dd0-9b3c-1dafec1eb845"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.775149 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7957c7d8-a7bf-4381-be99-a48f6ede8f50-config" (OuterVolumeSpecName: "config") pod "7957c7d8-a7bf-4381-be99-a48f6ede8f50" (UID: "7957c7d8-a7bf-4381-be99-a48f6ede8f50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.775161 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "48e7ebe7-7af6-4e99-bef7-8daaff76b916" (UID: "48e7ebe7-7af6-4e99-bef7-8daaff76b916"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.775227 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a759386-18a4-4dd0-9b3c-1dafec1eb845" (UID: "3a759386-18a4-4dd0-9b3c-1dafec1eb845"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.780333 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7957c7d8-a7bf-4381-be99-a48f6ede8f50-kube-api-access-2lrbn" (OuterVolumeSpecName: "kube-api-access-2lrbn") pod "7957c7d8-a7bf-4381-be99-a48f6ede8f50" (UID: "7957c7d8-a7bf-4381-be99-a48f6ede8f50"). InnerVolumeSpecName "kube-api-access-2lrbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.783191 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a759386-18a4-4dd0-9b3c-1dafec1eb845-kube-api-access-f7j9f" (OuterVolumeSpecName: "kube-api-access-f7j9f") pod "3a759386-18a4-4dd0-9b3c-1dafec1eb845" (UID: "3a759386-18a4-4dd0-9b3c-1dafec1eb845"). InnerVolumeSpecName "kube-api-access-f7j9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.785770 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e7ebe7-7af6-4e99-bef7-8daaff76b916-kube-api-access-24jjs" (OuterVolumeSpecName: "kube-api-access-24jjs") pod "48e7ebe7-7af6-4e99-bef7-8daaff76b916" (UID: "48e7ebe7-7af6-4e99-bef7-8daaff76b916"). InnerVolumeSpecName "kube-api-access-24jjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.787238 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-bz488" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.788505 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-ph479" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.805096 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8468885bfc-bz488" event={"ID":"7957c7d8-a7bf-4381-be99-a48f6ede8f50","Type":"ContainerDied","Data":"dac4563fdb7a62c3aa2e371a0c83f099abba407deb1fd359b0070cc06584374b"} Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.805214 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545d49fd5c-ph479" event={"ID":"48e7ebe7-7af6-4e99-bef7-8daaff76b916","Type":"ContainerDied","Data":"57e94399a96729092c441aa6ed8ac1013f1223f358710b0ff99a62991c7b7ce3"} Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.808960 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" event={"ID":"3a759386-18a4-4dd0-9b3c-1dafec1eb845","Type":"ContainerDied","Data":"5e2ed0da642e15fef4d9baa84757e54dd22ea53d061ae9bf7d3e006190c4cb62"} Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.809082 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7d487d45-ljhqr" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.832487 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" event={"ID":"b424873e-033e-4970-be30-3481fa57c5fc","Type":"ContainerDied","Data":"ba62510be5a7ba66d0b9541f72e5a7c2dd5de3313489398e77ea49a8561b0136"} Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.832585 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-jtq2p" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.838256 4717 generic.go:334] "Generic (PLEG): container finished" podID="0bed90a3-1840-4f1a-a71b-cad45398bd15" containerID="a1d9b54ae05eda6388a794f39fca11b1cfac28670351be52b112c1c7156e5b05" exitCode=0 Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.838435 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4c5fb" event={"ID":"0bed90a3-1840-4f1a-a71b-cad45398bd15","Type":"ContainerDied","Data":"a1d9b54ae05eda6388a794f39fca11b1cfac28670351be52b112c1c7156e5b05"} Mar 08 05:46:25 crc kubenswrapper[4717]: E0308 05:46:25.841465 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="353c8ab9-3710-4290-b5c4-b93339baf4da" Mar 08 05:46:25 crc kubenswrapper[4717]: E0308 05:46:25.845762 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-sb-db-server:current\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="0427b7fd-2766-4b7f-bb23-96df1b2f4f5c" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.876019 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48e7ebe7-7af6-4e99-bef7-8daaff76b916-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.876044 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.876052 4717 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7957c7d8-a7bf-4381-be99-a48f6ede8f50-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.876061 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24jjs\" (UniqueName: \"kubernetes.io/projected/48e7ebe7-7af6-4e99-bef7-8daaff76b916-kube-api-access-24jjs\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.876096 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a759386-18a4-4dd0-9b3c-1dafec1eb845-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.876105 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lrbn\" (UniqueName: \"kubernetes.io/projected/7957c7d8-a7bf-4381-be99-a48f6ede8f50-kube-api-access-2lrbn\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:25 crc kubenswrapper[4717]: I0308 05:46:25.876113 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7j9f\" (UniqueName: \"kubernetes.io/projected/3a759386-18a4-4dd0-9b3c-1dafec1eb845-kube-api-access-f7j9f\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.011694 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f7d487d45-ljhqr"] Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.019214 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f7d487d45-ljhqr"] Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.038721 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-ph479"] Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.044201 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-ph479"] Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 
05:46:26.068215 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-bz488"] Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.075226 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-bz488"] Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.088398 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-jtq2p"] Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.097035 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-jtq2p"] Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.217263 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.375167 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc"] Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.384162 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl47w\" (UniqueName: \"kubernetes.io/projected/8822d5b2-773c-48fa-90e8-09f890b9ca6a-kube-api-access-kl47w\") pod \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.384306 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-dns-svc\") pod \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\" (UID: \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.384343 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-config\") pod \"8822d5b2-773c-48fa-90e8-09f890b9ca6a\" (UID: 
\"8822d5b2-773c-48fa-90e8-09f890b9ca6a\") " Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.385186 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-config" (OuterVolumeSpecName: "config") pod "8822d5b2-773c-48fa-90e8-09f890b9ca6a" (UID: "8822d5b2-773c-48fa-90e8-09f890b9ca6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.386122 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8822d5b2-773c-48fa-90e8-09f890b9ca6a" (UID: "8822d5b2-773c-48fa-90e8-09f890b9ca6a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.486362 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.486648 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8822d5b2-773c-48fa-90e8-09f890b9ca6a-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.578186 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8822d5b2-773c-48fa-90e8-09f890b9ca6a-kube-api-access-kl47w" (OuterVolumeSpecName: "kube-api-access-kl47w") pod "8822d5b2-773c-48fa-90e8-09f890b9ca6a" (UID: "8822d5b2-773c-48fa-90e8-09f890b9ca6a"). InnerVolumeSpecName "kube-api-access-kl47w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.589766 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl47w\" (UniqueName: \"kubernetes.io/projected/8822d5b2-773c-48fa-90e8-09f890b9ca6a-kube-api-access-kl47w\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.849842 4717 generic.go:334] "Generic (PLEG): container finished" podID="c20fb9c8-328d-494c-b578-4abc028448bd" containerID="46a3f6ec15a16c4fee0133afccc02980a0d3834b1cb31a4698198b560a3df85b" exitCode=0 Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.849937 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549146-t9dcp" event={"ID":"c20fb9c8-328d-494c-b578-4abc028448bd","Type":"ContainerDied","Data":"46a3f6ec15a16c4fee0133afccc02980a0d3834b1cb31a4698198b560a3df85b"} Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.854086 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4c5fb" event={"ID":"0bed90a3-1840-4f1a-a71b-cad45398bd15","Type":"ContainerStarted","Data":"eaebbd515fcdef58dd3cac3790386de45c3f2e7dcc6351346d1e994f0722d66e"} Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.855402 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" event={"ID":"22841d35-043a-442d-beca-decc3c750d66","Type":"ContainerStarted","Data":"6a1aacfd359257a40e1f11e9ac5f1cdb63497b07b873f898ff8b4f0915e0d139"} Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.856956 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" event={"ID":"8822d5b2-773c-48fa-90e8-09f890b9ca6a","Type":"ContainerDied","Data":"b2311d8db3ad2289f0e07acd5c1a7cc18aa8544327dc92b51aadf3740b76e0de"} Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.857011 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-zgdqn" Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.864180 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d81933c3-0769-427a-a494-9cfd438d269d","Type":"ContainerStarted","Data":"2d97a3478ed536f81facdfe2b0853775e36e0c0937303da06b5778b97a7508c0"} Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.940064 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-zgdqn"] Mar 08 05:46:26 crc kubenswrapper[4717]: I0308 05:46:26.946675 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-zgdqn"] Mar 08 05:46:27 crc kubenswrapper[4717]: I0308 05:46:27.811591 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a759386-18a4-4dd0-9b3c-1dafec1eb845" path="/var/lib/kubelet/pods/3a759386-18a4-4dd0-9b3c-1dafec1eb845/volumes" Mar 08 05:46:27 crc kubenswrapper[4717]: I0308 05:46:27.812543 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e7ebe7-7af6-4e99-bef7-8daaff76b916" path="/var/lib/kubelet/pods/48e7ebe7-7af6-4e99-bef7-8daaff76b916/volumes" Mar 08 05:46:27 crc kubenswrapper[4717]: I0308 05:46:27.813416 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7957c7d8-a7bf-4381-be99-a48f6ede8f50" path="/var/lib/kubelet/pods/7957c7d8-a7bf-4381-be99-a48f6ede8f50/volumes" Mar 08 05:46:27 crc kubenswrapper[4717]: I0308 05:46:27.814342 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8822d5b2-773c-48fa-90e8-09f890b9ca6a" path="/var/lib/kubelet/pods/8822d5b2-773c-48fa-90e8-09f890b9ca6a/volumes" Mar 08 05:46:27 crc kubenswrapper[4717]: I0308 05:46:27.815127 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b424873e-033e-4970-be30-3481fa57c5fc" path="/var/lib/kubelet/pods/b424873e-033e-4970-be30-3481fa57c5fc/volumes" Mar 08 05:46:27 crc kubenswrapper[4717]: 
I0308 05:46:27.880108 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4c5fb" event={"ID":"0bed90a3-1840-4f1a-a71b-cad45398bd15","Type":"ContainerStarted","Data":"8cc59b17d0c08ce4f90edb69696ea49b31d2aaa8d4496f5815795a359b1138f5"} Mar 08 05:46:27 crc kubenswrapper[4717]: I0308 05:46:27.880503 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:46:27 crc kubenswrapper[4717]: I0308 05:46:27.880557 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:46:27 crc kubenswrapper[4717]: I0308 05:46:27.884926 4717 generic.go:334] "Generic (PLEG): container finished" podID="22841d35-043a-442d-beca-decc3c750d66" containerID="63b10a21e9f5b3ee560412875559e6b2d813d1c6fcd3c3d7b5f67eacb171079e" exitCode=0 Mar 08 05:46:27 crc kubenswrapper[4717]: I0308 05:46:27.884997 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" event={"ID":"22841d35-043a-442d-beca-decc3c750d66","Type":"ContainerDied","Data":"63b10a21e9f5b3ee560412875559e6b2d813d1c6fcd3c3d7b5f67eacb171079e"} Mar 08 05:46:27 crc kubenswrapper[4717]: I0308 05:46:27.920346 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4c5fb" podStartSLOduration=6.934497775 podStartE2EDuration="44.920326342s" podCreationTimestamp="2026-03-08 05:45:43 +0000 UTC" firstStartedPulling="2026-03-08 05:45:46.423996926 +0000 UTC m=+1173.341645770" lastFinishedPulling="2026-03-08 05:46:24.409825473 +0000 UTC m=+1211.327474337" observedRunningTime="2026-03-08 05:46:27.91374673 +0000 UTC m=+1214.831395614" watchObservedRunningTime="2026-03-08 05:46:27.920326342 +0000 UTC m=+1214.837975196" Mar 08 05:46:28 crc kubenswrapper[4717]: I0308 05:46:28.271794 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549146-t9dcp" Mar 08 05:46:28 crc kubenswrapper[4717]: I0308 05:46:28.465140 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d677j\" (UniqueName: \"kubernetes.io/projected/c20fb9c8-328d-494c-b578-4abc028448bd-kube-api-access-d677j\") pod \"c20fb9c8-328d-494c-b578-4abc028448bd\" (UID: \"c20fb9c8-328d-494c-b578-4abc028448bd\") " Mar 08 05:46:28 crc kubenswrapper[4717]: I0308 05:46:28.470998 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20fb9c8-328d-494c-b578-4abc028448bd-kube-api-access-d677j" (OuterVolumeSpecName: "kube-api-access-d677j") pod "c20fb9c8-328d-494c-b578-4abc028448bd" (UID: "c20fb9c8-328d-494c-b578-4abc028448bd"). InnerVolumeSpecName "kube-api-access-d677j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:28 crc kubenswrapper[4717]: I0308 05:46:28.566600 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d677j\" (UniqueName: \"kubernetes.io/projected/c20fb9c8-328d-494c-b578-4abc028448bd-kube-api-access-d677j\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:28 crc kubenswrapper[4717]: I0308 05:46:28.901701 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" event={"ID":"22841d35-043a-442d-beca-decc3c750d66","Type":"ContainerStarted","Data":"b94ec05f86cc041aa2434a2cf16033f0b4de1619b5baae7bbe9757bc75f8430a"} Mar 08 05:46:28 crc kubenswrapper[4717]: I0308 05:46:28.902856 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:28 crc kubenswrapper[4717]: I0308 05:46:28.905416 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549146-t9dcp" Mar 08 05:46:28 crc kubenswrapper[4717]: I0308 05:46:28.905468 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549146-t9dcp" event={"ID":"c20fb9c8-328d-494c-b578-4abc028448bd","Type":"ContainerDied","Data":"9f0f52b004d51c1639c1379ab5f10cfa4c0c3274c2d4c4536ce750daf340d7dc"} Mar 08 05:46:28 crc kubenswrapper[4717]: I0308 05:46:28.905490 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f0f52b004d51c1639c1379ab5f10cfa4c0c3274c2d4c4536ce750daf340d7dc" Mar 08 05:46:28 crc kubenswrapper[4717]: I0308 05:46:28.921005 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" podStartSLOduration=3.715451257 podStartE2EDuration="3.920986773s" podCreationTimestamp="2026-03-08 05:46:25 +0000 UTC" firstStartedPulling="2026-03-08 05:46:26.378002946 +0000 UTC m=+1213.295651800" lastFinishedPulling="2026-03-08 05:46:26.583538472 +0000 UTC m=+1213.501187316" observedRunningTime="2026-03-08 05:46:28.917935838 +0000 UTC m=+1215.835584702" watchObservedRunningTime="2026-03-08 05:46:28.920986773 +0000 UTC m=+1215.838635637" Mar 08 05:46:29 crc kubenswrapper[4717]: I0308 05:46:29.341458 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549140-mrfvv"] Mar 08 05:46:29 crc kubenswrapper[4717]: I0308 05:46:29.346973 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549140-mrfvv"] Mar 08 05:46:29 crc kubenswrapper[4717]: I0308 05:46:29.797011 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac4553d-f1b4-4587-b172-04c0823d4d67" path="/var/lib/kubelet/pods/0ac4553d-f1b4-4587-b172-04c0823d4d67/volumes" Mar 08 05:46:29 crc kubenswrapper[4717]: I0308 05:46:29.941081 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"739f45be-d031-4f80-9c39-1683ddff1289","Type":"ContainerStarted","Data":"fcb4c9c8f821cffb0165e4a566d7da3921bf8f1c36fd90afc37b5a992d0b163b"} Mar 08 05:46:30 crc kubenswrapper[4717]: I0308 05:46:30.957073 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d4a94056-9d2f-45ef-afa3-cf858787fc87","Type":"ContainerStarted","Data":"77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e"} Mar 08 05:46:30 crc kubenswrapper[4717]: I0308 05:46:30.960577 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d81933c3-0769-427a-a494-9cfd438d269d","Type":"ContainerStarted","Data":"f96498ca0245882030075fe2c8980ba30444b3170d792f5110267ec1c326806b"} Mar 08 05:46:30 crc kubenswrapper[4717]: I0308 05:46:30.962438 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec","Type":"ContainerStarted","Data":"7f449e2f2e084acd4fbdbe8981b6c42aa64820889a21b9690b7df2b6ba9a3e74"} Mar 08 05:46:31 crc kubenswrapper[4717]: I0308 05:46:31.010776 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=5.51103264 podStartE2EDuration="51.010753839s" podCreationTimestamp="2026-03-08 05:45:40 +0000 UTC" firstStartedPulling="2026-03-08 05:45:44.98742001 +0000 UTC m=+1171.905068854" lastFinishedPulling="2026-03-08 05:46:30.487141189 +0000 UTC m=+1217.404790053" observedRunningTime="2026-03-08 05:46:31.00189262 +0000 UTC m=+1217.919541484" watchObservedRunningTime="2026-03-08 05:46:31.010753839 +0000 UTC m=+1217.928402683" Mar 08 05:46:31 crc kubenswrapper[4717]: I0308 05:46:31.973260 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"ee8a4411-d973-4eeb-b6cd-eb0844e7826e","Type":"ContainerStarted","Data":"accce8a029bf0816cffe9df0ea3a2dc5b9297616668e36f84acb9c4033a26ac1"} Mar 08 05:46:31 crc kubenswrapper[4717]: I0308 05:46:31.974267 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 08 05:46:32 crc kubenswrapper[4717]: I0308 05:46:32.001444 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.916736352 podStartE2EDuration="55.001420502s" podCreationTimestamp="2026-03-08 05:45:37 +0000 UTC" firstStartedPulling="2026-03-08 05:45:38.916339809 +0000 UTC m=+1165.833988653" lastFinishedPulling="2026-03-08 05:46:31.001023949 +0000 UTC m=+1217.918672803" observedRunningTime="2026-03-08 05:46:31.997731731 +0000 UTC m=+1218.915380585" watchObservedRunningTime="2026-03-08 05:46:32.001420502 +0000 UTC m=+1218.919069356" Mar 08 05:46:32 crc kubenswrapper[4717]: I0308 05:46:32.660381 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:32 crc kubenswrapper[4717]: I0308 05:46:32.985353 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ce570a4-b883-4b07-a4a2-e5e820ab538c","Type":"ContainerStarted","Data":"ae723f586b191c6db521638d50b64a0c103a18f55e7dfb6f19046bc98fd39696"} Mar 08 05:46:33 crc kubenswrapper[4717]: I0308 05:46:33.996448 4717 generic.go:334] "Generic (PLEG): container finished" podID="9e4e6ff9-db68-44fc-a8d2-de9471a74f19" containerID="5291a481026bd5a521a3f28c93d3b2deb3687b2c5a56696bbadc4093a076b187" exitCode=0 Mar 08 05:46:33 crc kubenswrapper[4717]: I0308 05:46:33.996578 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9e4e6ff9-db68-44fc-a8d2-de9471a74f19","Type":"ContainerDied","Data":"5291a481026bd5a521a3f28c93d3b2deb3687b2c5a56696bbadc4093a076b187"} Mar 08 05:46:34 crc 
kubenswrapper[4717]: I0308 05:46:34.120491 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:46:34 crc kubenswrapper[4717]: I0308 05:46:34.120555 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:46:34 crc kubenswrapper[4717]: I0308 05:46:34.120598 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:46:34 crc kubenswrapper[4717]: I0308 05:46:34.121285 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14c69fb7e16b1586e83c7b94c4423a6de420e911261ae096ef8585ebcd99c77b"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 05:46:34 crc kubenswrapper[4717]: I0308 05:46:34.121341 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://14c69fb7e16b1586e83c7b94c4423a6de420e911261ae096ef8585ebcd99c77b" gracePeriod=600 Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.008864 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"9e4e6ff9-db68-44fc-a8d2-de9471a74f19","Type":"ContainerStarted","Data":"e3be1c7a6f96de0472e8c05f0d6cc330ea766f0d4b2534e1da1bb1e20dee2282"} Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.015371 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mcfsn" event={"ID":"52c7f6df-8563-4181-bc1e-6fb4c3dd2126","Type":"ContainerStarted","Data":"fff849dad451b0cd9ecbea969e90ccafc574677cf7b3eb065762ab9d41e770ff"} Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.015638 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mcfsn" Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.020591 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"14c69fb7e16b1586e83c7b94c4423a6de420e911261ae096ef8585ebcd99c77b"} Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.020640 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="14c69fb7e16b1586e83c7b94c4423a6de420e911261ae096ef8585ebcd99c77b" exitCode=0 Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.020672 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"fdc38828b70d25a0ccd54dcdac75ca0eebf8f58cb86023b869d6450eb8241d7e"} Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.020745 4717 scope.go:117] "RemoveContainer" containerID="c4b2434c01f53ad405ba837cb47237c7e26c6fdc63e5e92c263085831d1dc0d5" Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.055181 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=14.633609189 podStartE2EDuration="59.055162232s" 
podCreationTimestamp="2026-03-08 05:45:36 +0000 UTC" firstStartedPulling="2026-03-08 05:45:38.786551264 +0000 UTC m=+1165.704200108" lastFinishedPulling="2026-03-08 05:46:23.208104267 +0000 UTC m=+1210.125753151" observedRunningTime="2026-03-08 05:46:35.049106993 +0000 UTC m=+1221.966755847" watchObservedRunningTime="2026-03-08 05:46:35.055162232 +0000 UTC m=+1221.972811086"
Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.079701 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mcfsn" podStartSLOduration=3.940984332 podStartE2EDuration="52.079670528s" podCreationTimestamp="2026-03-08 05:45:43 +0000 UTC" firstStartedPulling="2026-03-08 05:45:45.831128445 +0000 UTC m=+1172.748777289" lastFinishedPulling="2026-03-08 05:46:33.969814631 +0000 UTC m=+1220.887463485" observedRunningTime="2026-03-08 05:46:35.070617144 +0000 UTC m=+1221.988265988" watchObservedRunningTime="2026-03-08 05:46:35.079670528 +0000 UTC m=+1221.997319392"
Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.768777 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc"
Mar 08 05:46:35 crc kubenswrapper[4717]: I0308 05:46:35.848955 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bbfd875c9-ssk72"]
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.187432 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72"
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.314008 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-config\") pod \"66f71441-f7d0-490c-817b-c0971f4c1fa2\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") "
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.314144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srkd2\" (UniqueName: \"kubernetes.io/projected/66f71441-f7d0-490c-817b-c0971f4c1fa2-kube-api-access-srkd2\") pod \"66f71441-f7d0-490c-817b-c0971f4c1fa2\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") "
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.314178 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-ovsdbserver-nb\") pod \"66f71441-f7d0-490c-817b-c0971f4c1fa2\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") "
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.314299 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-dns-svc\") pod \"66f71441-f7d0-490c-817b-c0971f4c1fa2\" (UID: \"66f71441-f7d0-490c-817b-c0971f4c1fa2\") "
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.314800 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-config" (OuterVolumeSpecName: "config") pod "66f71441-f7d0-490c-817b-c0971f4c1fa2" (UID: "66f71441-f7d0-490c-817b-c0971f4c1fa2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.314835 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66f71441-f7d0-490c-817b-c0971f4c1fa2" (UID: "66f71441-f7d0-490c-817b-c0971f4c1fa2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.315037 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66f71441-f7d0-490c-817b-c0971f4c1fa2" (UID: "66f71441-f7d0-490c-817b-c0971f4c1fa2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.321309 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f71441-f7d0-490c-817b-c0971f4c1fa2-kube-api-access-srkd2" (OuterVolumeSpecName: "kube-api-access-srkd2") pod "66f71441-f7d0-490c-817b-c0971f4c1fa2" (UID: "66f71441-f7d0-490c-817b-c0971f4c1fa2"). InnerVolumeSpecName "kube-api-access-srkd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.416081 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.416470 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-config\") on node \"crc\" DevicePath \"\""
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.416483 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srkd2\" (UniqueName: \"kubernetes.io/projected/66f71441-f7d0-490c-817b-c0971f4c1fa2-kube-api-access-srkd2\") on node \"crc\" DevicePath \"\""
Mar 08 05:46:36 crc kubenswrapper[4717]: I0308 05:46:36.416497 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66f71441-f7d0-490c-817b-c0971f4c1fa2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 08 05:46:37 crc kubenswrapper[4717]: I0308 05:46:37.051005 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72" event={"ID":"66f71441-f7d0-490c-817b-c0971f4c1fa2","Type":"ContainerDied","Data":"3533074e2eb2467acf832901e292d04f2e35eafb7315eaad1144e4b3f70f7e00"}
Mar 08 05:46:37 crc kubenswrapper[4717]: I0308 05:46:37.051080 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bbfd875c9-ssk72"
Mar 08 05:46:37 crc kubenswrapper[4717]: I0308 05:46:37.149657 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bbfd875c9-ssk72"]
Mar 08 05:46:37 crc kubenswrapper[4717]: I0308 05:46:37.162454 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bbfd875c9-ssk72"]
Mar 08 05:46:37 crc kubenswrapper[4717]: I0308 05:46:37.792434 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66f71441-f7d0-490c-817b-c0971f4c1fa2" path="/var/lib/kubelet/pods/66f71441-f7d0-490c-817b-c0971f4c1fa2/volumes"
Mar 08 05:46:38 crc kubenswrapper[4717]: I0308 05:46:38.064228 4717 generic.go:334] "Generic (PLEG): container finished" podID="739f45be-d031-4f80-9c39-1683ddff1289" containerID="fcb4c9c8f821cffb0165e4a566d7da3921bf8f1c36fd90afc37b5a992d0b163b" exitCode=0
Mar 08 05:46:38 crc kubenswrapper[4717]: I0308 05:46:38.064341 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"739f45be-d031-4f80-9c39-1683ddff1289","Type":"ContainerDied","Data":"fcb4c9c8f821cffb0165e4a566d7da3921bf8f1c36fd90afc37b5a992d0b163b"}
Mar 08 05:46:38 crc kubenswrapper[4717]: I0308 05:46:38.066961 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"353c8ab9-3710-4290-b5c4-b93339baf4da","Type":"ContainerStarted","Data":"57abe2d109d9c03334fec2064e23f5c13bfbfe6a2e9604119dcdb91d880aed3b"}
Mar 08 05:46:38 crc kubenswrapper[4717]: I0308 05:46:38.124437 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.501949735 podStartE2EDuration="56.124415566s" podCreationTimestamp="2026-03-08 05:45:42 +0000 UTC" firstStartedPulling="2026-03-08 05:45:46.332276961 +0000 UTC m=+1173.249925815" lastFinishedPulling="2026-03-08 05:46:36.954742762 +0000 UTC m=+1223.872391646" observedRunningTime="2026-03-08 05:46:38.116077751 +0000 UTC m=+1225.033726625" watchObservedRunningTime="2026-03-08 05:46:38.124415566 +0000 UTC m=+1225.042064430"
Mar 08 05:46:38 crc kubenswrapper[4717]: I0308 05:46:38.264425 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 08 05:46:38 crc kubenswrapper[4717]: I0308 05:46:38.264470 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 08 05:46:38 crc kubenswrapper[4717]: I0308 05:46:38.332837 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 08 05:46:38 crc kubenswrapper[4717]: I0308 05:46:38.333754 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 08 05:46:39 crc kubenswrapper[4717]: I0308 05:46:39.079934 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"739f45be-d031-4f80-9c39-1683ddff1289","Type":"ContainerStarted","Data":"3b056ed1e9b5de857beb265d7a29fe46dc33ea3a17f9f658cc8db7c7a5242092"}
Mar 08 05:46:39 crc kubenswrapper[4717]: I0308 05:46:39.082886 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0427b7fd-2766-4b7f-bb23-96df1b2f4f5c","Type":"ContainerStarted","Data":"a7f38764d6d3091c32bad4aa330675cf5dc611bcdb89098d72655895e39de7dc"}
Mar 08 05:46:39 crc kubenswrapper[4717]: I0308 05:46:39.104078 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371972.75072 podStartE2EDuration="1m4.104056467s" podCreationTimestamp="2026-03-08 05:45:35 +0000 UTC" firstStartedPulling="2026-03-08 05:45:37.138562789 +0000 UTC m=+1164.056211633" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:46:39.102797476 +0000 UTC m=+1226.020446320" watchObservedRunningTime="2026-03-08 05:46:39.104056467 +0000 UTC m=+1226.021705321"
Mar 08 05:46:39 crc kubenswrapper[4717]: I0308 05:46:39.138629 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.266104564 podStartE2EDuration="53.138609281s" podCreationTimestamp="2026-03-08 05:45:46 +0000 UTC" firstStartedPulling="2026-03-08 05:45:49.069395152 +0000 UTC m=+1175.987043996" lastFinishedPulling="2026-03-08 05:46:37.941899859 +0000 UTC m=+1224.859548713" observedRunningTime="2026-03-08 05:46:39.126786529 +0000 UTC m=+1226.044435383" watchObservedRunningTime="2026-03-08 05:46:39.138609281 +0000 UTC m=+1226.056258145"
Mar 08 05:46:39 crc kubenswrapper[4717]: E0308 05:46:39.156784 4717 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.44:38318->38.102.83.44:42899: write tcp 38.102.83.44:38318->38.102.83.44:42899: write: broken pipe
Mar 08 05:46:39 crc kubenswrapper[4717]: I0308 05:46:39.309405 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 08 05:46:39 crc kubenswrapper[4717]: I0308 05:46:39.334098 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 08 05:46:40 crc kubenswrapper[4717]: I0308 05:46:40.870662 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bf9dcd95-lbcd4"]
Mar 08 05:46:40 crc kubenswrapper[4717]: E0308 05:46:40.871338 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c20fb9c8-328d-494c-b578-4abc028448bd" containerName="oc"
Mar 08 05:46:40 crc kubenswrapper[4717]: I0308 05:46:40.871352 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20fb9c8-328d-494c-b578-4abc028448bd" containerName="oc"
Mar 08 05:46:40 crc kubenswrapper[4717]: I0308 05:46:40.871518 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c20fb9c8-328d-494c-b578-4abc028448bd" containerName="oc"
Mar 08 05:46:40 crc kubenswrapper[4717]: I0308 05:46:40.872476 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:40 crc kubenswrapper[4717]: I0308 05:46:40.915496 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-dns-svc\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:40 crc kubenswrapper[4717]: I0308 05:46:40.915549 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-nb\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:40 crc kubenswrapper[4717]: I0308 05:46:40.915614 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-sb\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:40 crc kubenswrapper[4717]: I0308 05:46:40.916044 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6lhw\" (UniqueName: \"kubernetes.io/projected/3406523d-3819-494f-9270-c6ad58910d30-kube-api-access-p6lhw\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:40 crc kubenswrapper[4717]: I0308 05:46:40.916450 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-config\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:40 crc kubenswrapper[4717]: I0308 05:46:40.934596 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bf9dcd95-lbcd4"]
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.018887 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-dns-svc\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.018949 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-nb\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.019026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-sb\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.019083 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6lhw\" (UniqueName: \"kubernetes.io/projected/3406523d-3819-494f-9270-c6ad58910d30-kube-api-access-p6lhw\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.019110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-config\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.020287 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-config\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.020986 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-dns-svc\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.021860 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-sb\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.022018 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-nb\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.046044 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6lhw\" (UniqueName: \"kubernetes.io/projected/3406523d-3819-494f-9270-c6ad58910d30-kube-api-access-p6lhw\") pod \"dnsmasq-dns-79bf9dcd95-lbcd4\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.188996 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.399270 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.666385 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bf9dcd95-lbcd4"]
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.953824 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.961411 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.970330 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.970809 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-kppsl"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.970824 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.970936 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 08 05:46:41 crc kubenswrapper[4717]: I0308 05:46:41.991051 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.039358 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/67a11de8-b5e8-40d8-a451-1bece45918d8-lock\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.039411 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r7rd\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-kube-api-access-6r7rd\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.039449 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.039474 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a11de8-b5e8-40d8-a451-1bece45918d8-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.039593 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/67a11de8-b5e8-40d8-a451-1bece45918d8-cache\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.039830 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.043245 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wc6z2"]
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.044628 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.046553 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.046798 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.046942 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.077372 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wc6z2"]
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.111026 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" event={"ID":"3406523d-3819-494f-9270-c6ad58910d30","Type":"ContainerStarted","Data":"3e80510e75069e00c491f33e3f91b189c95cbdb60f3054e0800ace965f0d7419"}
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.141814 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-swiftconf\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.141872 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.141908 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-ring-data-devices\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.141943 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-scripts\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.141977 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6qtd\" (UniqueName: \"kubernetes.io/projected/03d2941c-7434-4961-a7ea-fdff878a1128-kube-api-access-z6qtd\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142017 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03d2941c-7434-4961-a7ea-fdff878a1128-etc-swift\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142045 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-dispersionconf\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142069 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-combined-ca-bundle\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142131 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/67a11de8-b5e8-40d8-a451-1bece45918d8-lock\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142155 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r7rd\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-kube-api-access-6r7rd\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142327 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142348 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: E0308 05:46:42.142538 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 05:46:42 crc kubenswrapper[4717]: E0308 05:46:42.142559 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 05:46:42 crc kubenswrapper[4717]: E0308 05:46:42.142596 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift podName:67a11de8-b5e8-40d8-a451-1bece45918d8 nodeName:}" failed. No retries permitted until 2026-03-08 05:46:42.642581792 +0000 UTC m=+1229.560230636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift") pod "swift-storage-0" (UID: "67a11de8-b5e8-40d8-a451-1bece45918d8") : configmap "swift-ring-files" not found
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a11de8-b5e8-40d8-a451-1bece45918d8-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142663 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/67a11de8-b5e8-40d8-a451-1bece45918d8-cache\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142664 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/67a11de8-b5e8-40d8-a451-1bece45918d8-lock\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.142887 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/67a11de8-b5e8-40d8-a451-1bece45918d8-cache\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.147971 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a11de8-b5e8-40d8-a451-1bece45918d8-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.158904 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r7rd\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-kube-api-access-6r7rd\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.163755 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.244090 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-scripts\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.244146 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6qtd\" (UniqueName: \"kubernetes.io/projected/03d2941c-7434-4961-a7ea-fdff878a1128-kube-api-access-z6qtd\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.244188 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03d2941c-7434-4961-a7ea-fdff878a1128-etc-swift\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.244222 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-dispersionconf\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.244258 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-combined-ca-bundle\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.244416 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-swiftconf\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.244459 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-ring-data-devices\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.244617 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03d2941c-7434-4961-a7ea-fdff878a1128-etc-swift\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.245002 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-scripts\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.245234 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-ring-data-devices\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.247999 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-dispersionconf\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.248435 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-combined-ca-bundle\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.252116 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-swiftconf\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.260283 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6qtd\" (UniqueName: \"kubernetes.io/projected/03d2941c-7434-4961-a7ea-fdff878a1128-kube-api-access-z6qtd\") pod \"swift-ring-rebalance-wc6z2\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.353774 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.354317 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.406196 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wc6z2"
Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.655501 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0"
Mar 08 05:46:42 crc kubenswrapper[4717]: E0308 05:46:42.655861 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 05:46:42 crc kubenswrapper[4717]: E0308 05:46:42.656011 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 05:46:42 crc kubenswrapper[4717]: E0308 05:46:42.656115 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift podName:67a11de8-b5e8-40d8-a451-1bece45918d8 nodeName:}" failed. No retries permitted until 2026-03-08 05:46:43.656090833 +0000 UTC m=+1230.573739687 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift") pod "swift-storage-0" (UID: "67a11de8-b5e8-40d8-a451-1bece45918d8") : configmap "swift-ring-files" not found Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.660520 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.663787 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:42 crc kubenswrapper[4717]: I0308 05:46:42.698325 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wc6z2"] Mar 08 05:46:43 crc kubenswrapper[4717]: I0308 05:46:43.117952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wc6z2" event={"ID":"03d2941c-7434-4961-a7ea-fdff878a1128","Type":"ContainerStarted","Data":"2711bab0d4969ef6a9200460d2834af9c2e19b5b6f299b298402190a716bf0c3"} Mar 08 05:46:43 crc kubenswrapper[4717]: I0308 05:46:43.120363 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:43 crc kubenswrapper[4717]: I0308 05:46:43.197318 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 08 05:46:43 crc kubenswrapper[4717]: I0308 05:46:43.672602 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0" Mar 08 05:46:43 crc kubenswrapper[4717]: E0308 05:46:43.672913 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 05:46:43 crc 
kubenswrapper[4717]: E0308 05:46:43.673150 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 05:46:43 crc kubenswrapper[4717]: E0308 05:46:43.673263 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift podName:67a11de8-b5e8-40d8-a451-1bece45918d8 nodeName:}" failed. No retries permitted until 2026-03-08 05:46:45.67322971 +0000 UTC m=+1232.590878594 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift") pod "swift-storage-0" (UID: "67a11de8-b5e8-40d8-a451-1bece45918d8") : configmap "swift-ring-files" not found Mar 08 05:46:43 crc kubenswrapper[4717]: I0308 05:46:43.810118 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 08 05:46:43 crc kubenswrapper[4717]: I0308 05:46:43.942164 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.393804 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.551428 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.554974 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.557728 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.557863 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.558036 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-cb2gx" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.558152 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.569767 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.592939 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bde07dc5-6141-42e2-b280-d4df5ebe3d61-config\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.593007 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bde07dc5-6141-42e2-b280-d4df5ebe3d61-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.593055 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde07dc5-6141-42e2-b280-d4df5ebe3d61-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 
05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.593081 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bde07dc5-6141-42e2-b280-d4df5ebe3d61-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.593112 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bde07dc5-6141-42e2-b280-d4df5ebe3d61-scripts\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.593240 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bde07dc5-6141-42e2-b280-d4df5ebe3d61-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.593290 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd44v\" (UniqueName: \"kubernetes.io/projected/bde07dc5-6141-42e2-b280-d4df5ebe3d61-kube-api-access-gd44v\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.694770 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bde07dc5-6141-42e2-b280-d4df5ebe3d61-config\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.694833 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bde07dc5-6141-42e2-b280-d4df5ebe3d61-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.694878 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde07dc5-6141-42e2-b280-d4df5ebe3d61-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.694907 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bde07dc5-6141-42e2-b280-d4df5ebe3d61-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.694971 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bde07dc5-6141-42e2-b280-d4df5ebe3d61-scripts\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.695134 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bde07dc5-6141-42e2-b280-d4df5ebe3d61-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.695181 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd44v\" (UniqueName: \"kubernetes.io/projected/bde07dc5-6141-42e2-b280-d4df5ebe3d61-kube-api-access-gd44v\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " 
pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.696190 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bde07dc5-6141-42e2-b280-d4df5ebe3d61-scripts\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.696194 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bde07dc5-6141-42e2-b280-d4df5ebe3d61-config\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.696318 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bde07dc5-6141-42e2-b280-d4df5ebe3d61-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.706307 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bde07dc5-6141-42e2-b280-d4df5ebe3d61-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.718444 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde07dc5-6141-42e2-b280-d4df5ebe3d61-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.718875 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bde07dc5-6141-42e2-b280-d4df5ebe3d61-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.722608 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd44v\" (UniqueName: \"kubernetes.io/projected/bde07dc5-6141-42e2-b280-d4df5ebe3d61-kube-api-access-gd44v\") pod \"ovn-northd-0\" (UID: \"bde07dc5-6141-42e2-b280-d4df5ebe3d61\") " pod="openstack/ovn-northd-0" Mar 08 05:46:44 crc kubenswrapper[4717]: I0308 05:46:44.877211 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 05:46:45 crc kubenswrapper[4717]: I0308 05:46:45.145044 4717 generic.go:334] "Generic (PLEG): container finished" podID="3406523d-3819-494f-9270-c6ad58910d30" containerID="cdb0801cf44d2814be0de99ef023dba75db4f26bbfd0eddb9895bd7ef5fd56f1" exitCode=0 Mar 08 05:46:45 crc kubenswrapper[4717]: I0308 05:46:45.145091 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" event={"ID":"3406523d-3819-494f-9270-c6ad58910d30","Type":"ContainerDied","Data":"cdb0801cf44d2814be0de99ef023dba75db4f26bbfd0eddb9895bd7ef5fd56f1"} Mar 08 05:46:45 crc kubenswrapper[4717]: I0308 05:46:45.717653 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0" Mar 08 05:46:45 crc kubenswrapper[4717]: E0308 05:46:45.718130 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 05:46:45 crc kubenswrapper[4717]: E0308 05:46:45.718175 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap 
"swift-ring-files" not found Mar 08 05:46:45 crc kubenswrapper[4717]: E0308 05:46:45.718257 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift podName:67a11de8-b5e8-40d8-a451-1bece45918d8 nodeName:}" failed. No retries permitted until 2026-03-08 05:46:49.718229623 +0000 UTC m=+1236.635878487 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift") pod "swift-storage-0" (UID: "67a11de8-b5e8-40d8-a451-1bece45918d8") : configmap "swift-ring-files" not found Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.117447 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.576966 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.577011 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.688236 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fpb8l"] Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.689842 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fpb8l" Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.696071 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.696529 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fpb8l"] Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.736370 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khsjw\" (UniqueName: \"kubernetes.io/projected/47425ac9-00c9-48b9-bd55-ef9b0d921491-kube-api-access-khsjw\") pod \"root-account-create-update-fpb8l\" (UID: \"47425ac9-00c9-48b9-bd55-ef9b0d921491\") " pod="openstack/root-account-create-update-fpb8l" Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.736708 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47425ac9-00c9-48b9-bd55-ef9b0d921491-operator-scripts\") pod \"root-account-create-update-fpb8l\" (UID: \"47425ac9-00c9-48b9-bd55-ef9b0d921491\") " pod="openstack/root-account-create-update-fpb8l" Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.838171 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khsjw\" (UniqueName: \"kubernetes.io/projected/47425ac9-00c9-48b9-bd55-ef9b0d921491-kube-api-access-khsjw\") pod \"root-account-create-update-fpb8l\" (UID: \"47425ac9-00c9-48b9-bd55-ef9b0d921491\") " pod="openstack/root-account-create-update-fpb8l" Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.838240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47425ac9-00c9-48b9-bd55-ef9b0d921491-operator-scripts\") pod \"root-account-create-update-fpb8l\" (UID: 
\"47425ac9-00c9-48b9-bd55-ef9b0d921491\") " pod="openstack/root-account-create-update-fpb8l" Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.840418 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47425ac9-00c9-48b9-bd55-ef9b0d921491-operator-scripts\") pod \"root-account-create-update-fpb8l\" (UID: \"47425ac9-00c9-48b9-bd55-ef9b0d921491\") " pod="openstack/root-account-create-update-fpb8l" Mar 08 05:46:46 crc kubenswrapper[4717]: I0308 05:46:46.874267 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khsjw\" (UniqueName: \"kubernetes.io/projected/47425ac9-00c9-48b9-bd55-ef9b0d921491-kube-api-access-khsjw\") pod \"root-account-create-update-fpb8l\" (UID: \"47425ac9-00c9-48b9-bd55-ef9b0d921491\") " pod="openstack/root-account-create-update-fpb8l" Mar 08 05:46:47 crc kubenswrapper[4717]: I0308 05:46:47.042701 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fpb8l" Mar 08 05:46:47 crc kubenswrapper[4717]: I0308 05:46:47.718674 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 05:46:47 crc kubenswrapper[4717]: I0308 05:46:47.719242 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="prometheus" containerID="cri-o://86d872f2255e9601628014092ab8d5a682407736b62dfc2349632a05be42f8be" gracePeriod=600 Mar 08 05:46:47 crc kubenswrapper[4717]: I0308 05:46:47.719348 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="thanos-sidecar" containerID="cri-o://f96498ca0245882030075fe2c8980ba30444b3170d792f5110267ec1c326806b" gracePeriod=600 Mar 08 05:46:47 crc kubenswrapper[4717]: I0308 05:46:47.719390 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="config-reloader" containerID="cri-o://2d97a3478ed536f81facdfe2b0853775e36e0c0937303da06b5778b97a7508c0" gracePeriod=600 Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.169634 4717 generic.go:334] "Generic (PLEG): container finished" podID="d81933c3-0769-427a-a494-9cfd438d269d" containerID="f96498ca0245882030075fe2c8980ba30444b3170d792f5110267ec1c326806b" exitCode=0 Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.169660 4717 generic.go:334] "Generic (PLEG): container finished" podID="d81933c3-0769-427a-a494-9cfd438d269d" containerID="2d97a3478ed536f81facdfe2b0853775e36e0c0937303da06b5778b97a7508c0" exitCode=0 Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.169668 4717 generic.go:334] "Generic (PLEG): container finished" podID="d81933c3-0769-427a-a494-9cfd438d269d" 
containerID="86d872f2255e9601628014092ab8d5a682407736b62dfc2349632a05be42f8be" exitCode=0 Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.169706 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d81933c3-0769-427a-a494-9cfd438d269d","Type":"ContainerDied","Data":"f96498ca0245882030075fe2c8980ba30444b3170d792f5110267ec1c326806b"} Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.169730 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d81933c3-0769-427a-a494-9cfd438d269d","Type":"ContainerDied","Data":"2d97a3478ed536f81facdfe2b0853775e36e0c0937303da06b5778b97a7508c0"} Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.169742 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d81933c3-0769-427a-a494-9cfd438d269d","Type":"ContainerDied","Data":"86d872f2255e9601628014092ab8d5a682407736b62dfc2349632a05be42f8be"} Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.557171 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.671632 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-1\") pod \"d81933c3-0769-427a-a494-9cfd438d269d\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.671697 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-tls-assets\") pod \"d81933c3-0769-427a-a494-9cfd438d269d\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.671717 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-0\") pod \"d81933c3-0769-427a-a494-9cfd438d269d\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.671788 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-2\") pod \"d81933c3-0769-427a-a494-9cfd438d269d\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.671808 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-config\") pod \"d81933c3-0769-427a-a494-9cfd438d269d\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 
05:46:48.671827 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d81933c3-0769-427a-a494-9cfd438d269d-config-out\") pod \"d81933c3-0769-427a-a494-9cfd438d269d\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.671842 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpnnv\" (UniqueName: \"kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-kube-api-access-wpnnv\") pod \"d81933c3-0769-427a-a494-9cfd438d269d\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.672017 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"d81933c3-0769-427a-a494-9cfd438d269d\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.672041 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-thanos-prometheus-http-client-file\") pod \"d81933c3-0769-427a-a494-9cfd438d269d\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.672079 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-web-config\") pod \"d81933c3-0769-427a-a494-9cfd438d269d\" (UID: \"d81933c3-0769-427a-a494-9cfd438d269d\") " Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.673383 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "d81933c3-0769-427a-a494-9cfd438d269d" (UID: "d81933c3-0769-427a-a494-9cfd438d269d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.673377 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "d81933c3-0769-427a-a494-9cfd438d269d" (UID: "d81933c3-0769-427a-a494-9cfd438d269d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.675991 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "d81933c3-0769-427a-a494-9cfd438d269d" (UID: "d81933c3-0769-427a-a494-9cfd438d269d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.678072 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-kube-api-access-wpnnv" (OuterVolumeSpecName: "kube-api-access-wpnnv") pod "d81933c3-0769-427a-a494-9cfd438d269d" (UID: "d81933c3-0769-427a-a494-9cfd438d269d"). InnerVolumeSpecName "kube-api-access-wpnnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.679285 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d81933c3-0769-427a-a494-9cfd438d269d" (UID: "d81933c3-0769-427a-a494-9cfd438d269d"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.679409 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d81933c3-0769-427a-a494-9cfd438d269d" (UID: "d81933c3-0769-427a-a494-9cfd438d269d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.682035 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-config" (OuterVolumeSpecName: "config") pod "d81933c3-0769-427a-a494-9cfd438d269d" (UID: "d81933c3-0769-427a-a494-9cfd438d269d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.684342 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d81933c3-0769-427a-a494-9cfd438d269d-config-out" (OuterVolumeSpecName: "config-out") pod "d81933c3-0769-427a-a494-9cfd438d269d" (UID: "d81933c3-0769-427a-a494-9cfd438d269d"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.694731 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "d81933c3-0769-427a-a494-9cfd438d269d" (UID: "d81933c3-0769-427a-a494-9cfd438d269d"). InnerVolumeSpecName "pvc-6176dbd4-0abf-4276-942d-9f92f0510af7". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.703055 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-web-config" (OuterVolumeSpecName: "web-config") pod "d81933c3-0769-427a-a494-9cfd438d269d" (UID: "d81933c3-0769-427a-a494-9cfd438d269d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.736262 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fpb8l"] Mar 08 05:46:48 crc kubenswrapper[4717]: W0308 05:46:48.741398 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47425ac9_00c9_48b9_bd55_ef9b0d921491.slice/crio-8adce2d050b1930b5e47ec7708bccc61be4feca95914e653b6e87f3b554e12f6 WatchSource:0}: Error finding container 8adce2d050b1930b5e47ec7708bccc61be4feca95914e653b6e87f3b554e12f6: Status 404 returned error can't find the container with id 8adce2d050b1930b5e47ec7708bccc61be4feca95914e653b6e87f3b554e12f6 Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.773675 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") on node \"crc\" " 
Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.773725 4717 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.773738 4717 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-web-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.773747 4717 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.773759 4717 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.773768 4717 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.773776 4717 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d81933c3-0769-427a-a494-9cfd438d269d-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.773786 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d81933c3-0769-427a-a494-9cfd438d269d-config\") on node \"crc\" 
DevicePath \"\"" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.773795 4717 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d81933c3-0769-427a-a494-9cfd438d269d-config-out\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.773803 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpnnv\" (UniqueName: \"kubernetes.io/projected/d81933c3-0769-427a-a494-9cfd438d269d-kube-api-access-wpnnv\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.797797 4717 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.797959 4717 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6176dbd4-0abf-4276-942d-9f92f0510af7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7") on node "crc" Mar 08 05:46:48 crc kubenswrapper[4717]: I0308 05:46:48.875834 4717 reconciler_common.go:293] "Volume detached for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.006958 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.162342 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.190716 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"d81933c3-0769-427a-a494-9cfd438d269d","Type":"ContainerDied","Data":"ff886074339a045af341a73338f577c36aa46ce7b3fc23dbd9550cd5f8aed868"} Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.190772 4717 scope.go:117] "RemoveContainer" containerID="f96498ca0245882030075fe2c8980ba30444b3170d792f5110267ec1c326806b" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.190925 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.196919 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bde07dc5-6141-42e2-b280-d4df5ebe3d61","Type":"ContainerStarted","Data":"ded30f04b8dbce25a88e9aad6c4dc4d31a8a6e633d3e3f10c6610a4ce5570974"} Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.207509 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fpb8l" event={"ID":"47425ac9-00c9-48b9-bd55-ef9b0d921491","Type":"ContainerStarted","Data":"8adce2d050b1930b5e47ec7708bccc61be4feca95914e653b6e87f3b554e12f6"} Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.219920 4717 scope.go:117] "RemoveContainer" containerID="2d97a3478ed536f81facdfe2b0853775e36e0c0937303da06b5778b97a7508c0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.227608 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" event={"ID":"3406523d-3819-494f-9270-c6ad58910d30","Type":"ContainerStarted","Data":"937e5048d54aa262d500d2a6e10bc62ace20ddcf53b37b8c957b4f4af650b273"} Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.228155 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.241872 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wc6z2" 
event={"ID":"03d2941c-7434-4961-a7ea-fdff878a1128","Type":"ContainerStarted","Data":"81f8dddefadf9338833053ed46e325f5c3790426f6c550ff9b0ae964202eb45d"} Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.252225 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" podStartSLOduration=9.252200986 podStartE2EDuration="9.252200986s" podCreationTimestamp="2026-03-08 05:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:46:49.250063604 +0000 UTC m=+1236.167712448" watchObservedRunningTime="2026-03-08 05:46:49.252200986 +0000 UTC m=+1236.169849840" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.286525 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-wc6z2" podStartSLOduration=1.6983351180000001 podStartE2EDuration="7.286508271s" podCreationTimestamp="2026-03-08 05:46:42 +0000 UTC" firstStartedPulling="2026-03-08 05:46:42.708877486 +0000 UTC m=+1229.626526340" lastFinishedPulling="2026-03-08 05:46:48.297050649 +0000 UTC m=+1235.214699493" observedRunningTime="2026-03-08 05:46:49.265132425 +0000 UTC m=+1236.182781269" watchObservedRunningTime="2026-03-08 05:46:49.286508271 +0000 UTC m=+1236.204157115" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.331305 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.336434 4717 scope.go:117] "RemoveContainer" containerID="86d872f2255e9601628014092ab8d5a682407736b62dfc2349632a05be42f8be" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.339407 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.419462 4717 scope.go:117] "RemoveContainer" 
containerID="0d1fa0d6d0a849b60d021551aa1fab1224a09ede8f8e625b4b99600fa195f44a" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.428711 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 05:46:49 crc kubenswrapper[4717]: E0308 05:46:49.429417 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="config-reloader" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.429431 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="config-reloader" Mar 08 05:46:49 crc kubenswrapper[4717]: E0308 05:46:49.429451 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="init-config-reloader" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.429458 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="init-config-reloader" Mar 08 05:46:49 crc kubenswrapper[4717]: E0308 05:46:49.429497 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="prometheus" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.429503 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="prometheus" Mar 08 05:46:49 crc kubenswrapper[4717]: E0308 05:46:49.429516 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="thanos-sidecar" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.429523 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="thanos-sidecar" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.429849 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d81933c3-0769-427a-a494-9cfd438d269d" 
containerName="prometheus" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.429868 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="thanos-sidecar" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.429889 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d81933c3-0769-427a-a494-9cfd438d269d" containerName="config-reloader" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.436016 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.489369 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.489565 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.489653 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.489784 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.490062 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.490165 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.490288 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.490386 4717 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-ckzwm" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.505640 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.512762 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.538392 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cc34-account-create-update-kcgjs"] Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.539506 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cc34-account-create-update-kcgjs" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.544226 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.573745 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cc34-account-create-update-kcgjs"] Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.597318 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-z92m6"] Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.598434 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-z92m6" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603562 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603600 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtc5b\" (UniqueName: \"kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-kube-api-access-xtc5b\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603633 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603654 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603718 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603738 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603760 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603797 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603815 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: 
\"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603837 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603859 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603890 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.603908 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.652729 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-z92m6"] Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.684548 4717 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-efc6-account-create-update-79h54"] Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.685677 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-efc6-account-create-update-79h54" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.688142 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.702045 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-efc6-account-create-update-79h54"] Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714268 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714319 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714373 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 
08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714392 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714413 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23475f99-47f9-4533-9a3d-c8a024f6bfdb-operator-scripts\") pod \"placement-db-create-z92m6\" (UID: \"23475f99-47f9-4533-9a3d-c8a024f6bfdb\") " pod="openstack/placement-db-create-z92m6" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714433 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714459 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkzsz\" (UniqueName: \"kubernetes.io/projected/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-kube-api-access-bkzsz\") pod \"keystone-cc34-account-create-update-kcgjs\" (UID: \"9e74691e-98a9-4c29-9c1f-8ce4e5788f35\") " pod="openstack/keystone-cc34-account-create-update-kcgjs" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714489 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-tls-assets\") pod \"prometheus-metric-storage-0\" 
(UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714506 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-operator-scripts\") pod \"keystone-cc34-account-create-update-kcgjs\" (UID: \"9e74691e-98a9-4c29-9c1f-8ce4e5788f35\") " pod="openstack/keystone-cc34-account-create-update-kcgjs" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714525 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4dwb\" (UniqueName: \"kubernetes.io/projected/23475f99-47f9-4533-9a3d-c8a024f6bfdb-kube-api-access-w4dwb\") pod \"placement-db-create-z92m6\" (UID: \"23475f99-47f9-4533-9a3d-c8a024f6bfdb\") " pod="openstack/placement-db-create-z92m6" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714581 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714603 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714636 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714677 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.714707 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtc5b\" (UniqueName: \"kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-kube-api-access-xtc5b\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.715447 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.726102 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.727662 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.731476 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.733024 4717 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.733050 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b9ef073caa6ec1ae4d35eecfe80ee2af5cbcdd85b8b9ead8efa911e24063287d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.738052 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.738161 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtc5b\" (UniqueName: \"kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-kube-api-access-xtc5b\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.739132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.740997 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.744344 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.746435 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.747004 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.747373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.800482 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d81933c3-0769-427a-a494-9cfd438d269d" path="/var/lib/kubelet/pods/d81933c3-0769-427a-a494-9cfd438d269d/volumes" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.810009 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.816091 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ef4c0a-9284-45bb-9319-21b75e4e1327-operator-scripts\") pod \"placement-efc6-account-create-update-79h54\" (UID: \"22ef4c0a-9284-45bb-9319-21b75e4e1327\") " pod="openstack/placement-efc6-account-create-update-79h54" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.816225 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.816255 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23475f99-47f9-4533-9a3d-c8a024f6bfdb-operator-scripts\") pod \"placement-db-create-z92m6\" (UID: \"23475f99-47f9-4533-9a3d-c8a024f6bfdb\") " pod="openstack/placement-db-create-z92m6" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.816287 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkzsz\" (UniqueName: 
\"kubernetes.io/projected/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-kube-api-access-bkzsz\") pod \"keystone-cc34-account-create-update-kcgjs\" (UID: \"9e74691e-98a9-4c29-9c1f-8ce4e5788f35\") " pod="openstack/keystone-cc34-account-create-update-kcgjs" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.816318 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7649g\" (UniqueName: \"kubernetes.io/projected/22ef4c0a-9284-45bb-9319-21b75e4e1327-kube-api-access-7649g\") pod \"placement-efc6-account-create-update-79h54\" (UID: \"22ef4c0a-9284-45bb-9319-21b75e4e1327\") " pod="openstack/placement-efc6-account-create-update-79h54" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.816352 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-operator-scripts\") pod \"keystone-cc34-account-create-update-kcgjs\" (UID: \"9e74691e-98a9-4c29-9c1f-8ce4e5788f35\") " pod="openstack/keystone-cc34-account-create-update-kcgjs" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.816375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4dwb\" (UniqueName: \"kubernetes.io/projected/23475f99-47f9-4533-9a3d-c8a024f6bfdb-kube-api-access-w4dwb\") pod \"placement-db-create-z92m6\" (UID: \"23475f99-47f9-4533-9a3d-c8a024f6bfdb\") " pod="openstack/placement-db-create-z92m6" Mar 08 05:46:49 crc kubenswrapper[4717]: E0308 05:46:49.816942 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 05:46:49 crc kubenswrapper[4717]: E0308 05:46:49.816960 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 05:46:49 crc kubenswrapper[4717]: E0308 05:46:49.816994 4717 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift podName:67a11de8-b5e8-40d8-a451-1bece45918d8 nodeName:}" failed. No retries permitted until 2026-03-08 05:46:57.816980282 +0000 UTC m=+1244.734629126 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift") pod "swift-storage-0" (UID: "67a11de8-b5e8-40d8-a451-1bece45918d8") : configmap "swift-ring-files" not found Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.817580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-operator-scripts\") pod \"keystone-cc34-account-create-update-kcgjs\" (UID: \"9e74691e-98a9-4c29-9c1f-8ce4e5788f35\") " pod="openstack/keystone-cc34-account-create-update-kcgjs" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.817807 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23475f99-47f9-4533-9a3d-c8a024f6bfdb-operator-scripts\") pod \"placement-db-create-z92m6\" (UID: \"23475f99-47f9-4533-9a3d-c8a024f6bfdb\") " pod="openstack/placement-db-create-z92m6" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.832350 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4dwb\" (UniqueName: \"kubernetes.io/projected/23475f99-47f9-4533-9a3d-c8a024f6bfdb-kube-api-access-w4dwb\") pod \"placement-db-create-z92m6\" (UID: \"23475f99-47f9-4533-9a3d-c8a024f6bfdb\") " pod="openstack/placement-db-create-z92m6" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.835282 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkzsz\" (UniqueName: \"kubernetes.io/projected/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-kube-api-access-bkzsz\") pod 
\"keystone-cc34-account-create-update-kcgjs\" (UID: \"9e74691e-98a9-4c29-9c1f-8ce4e5788f35\") " pod="openstack/keystone-cc34-account-create-update-kcgjs" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.918379 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ef4c0a-9284-45bb-9319-21b75e4e1327-operator-scripts\") pod \"placement-efc6-account-create-update-79h54\" (UID: \"22ef4c0a-9284-45bb-9319-21b75e4e1327\") " pod="openstack/placement-efc6-account-create-update-79h54" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.918531 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7649g\" (UniqueName: \"kubernetes.io/projected/22ef4c0a-9284-45bb-9319-21b75e4e1327-kube-api-access-7649g\") pod \"placement-efc6-account-create-update-79h54\" (UID: \"22ef4c0a-9284-45bb-9319-21b75e4e1327\") " pod="openstack/placement-efc6-account-create-update-79h54" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.919187 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ef4c0a-9284-45bb-9319-21b75e4e1327-operator-scripts\") pod \"placement-efc6-account-create-update-79h54\" (UID: \"22ef4c0a-9284-45bb-9319-21b75e4e1327\") " pod="openstack/placement-efc6-account-create-update-79h54" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.934902 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7649g\" (UniqueName: \"kubernetes.io/projected/22ef4c0a-9284-45bb-9319-21b75e4e1327-kube-api-access-7649g\") pod \"placement-efc6-account-create-update-79h54\" (UID: \"22ef4c0a-9284-45bb-9319-21b75e4e1327\") " pod="openstack/placement-efc6-account-create-update-79h54" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.956084 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 05:46:49 crc kubenswrapper[4717]: I0308 05:46:49.985671 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cc34-account-create-update-kcgjs" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.014027 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z92m6" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.027398 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-efc6-account-create-update-79h54" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.253808 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bde07dc5-6141-42e2-b280-d4df5ebe3d61","Type":"ContainerStarted","Data":"ef51f3a648124caefc3d56a08dc5e91794b21a5f9dd46e63cb4c93b49d632a9e"} Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.254050 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bde07dc5-6141-42e2-b280-d4df5ebe3d61","Type":"ContainerStarted","Data":"8b84ed6f973a7b2ecdc796bffe350eb01334731703505e2024296eb9e09ab744"} Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.254090 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.262994 4717 generic.go:334] "Generic (PLEG): container finished" podID="47425ac9-00c9-48b9-bd55-ef9b0d921491" containerID="f8f39680a411ecd5f389c31818c196277ab7a372cccccbd8c096d4ced38e9d4f" exitCode=0 Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.263074 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fpb8l" event={"ID":"47425ac9-00c9-48b9-bd55-ef9b0d921491","Type":"ContainerDied","Data":"f8f39680a411ecd5f389c31818c196277ab7a372cccccbd8c096d4ced38e9d4f"} Mar 08 05:46:50 crc 
kubenswrapper[4717]: I0308 05:46:50.275683 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=5.360913723 podStartE2EDuration="6.275658806s" podCreationTimestamp="2026-03-08 05:46:44 +0000 UTC" firstStartedPulling="2026-03-08 05:46:48.224149774 +0000 UTC m=+1235.141798638" lastFinishedPulling="2026-03-08 05:46:49.138894877 +0000 UTC m=+1236.056543721" observedRunningTime="2026-03-08 05:46:50.270484799 +0000 UTC m=+1237.188133653" watchObservedRunningTime="2026-03-08 05:46:50.275658806 +0000 UTC m=+1237.193307650" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.458536 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.622129 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-z92m6"] Mar 08 05:46:50 crc kubenswrapper[4717]: W0308 05:46:50.627004 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23475f99_47f9_4533_9a3d_c8a024f6bfdb.slice/crio-6544f5f4e8b725fecaebcfead9d41d5c254c92a3986068e4914e495facebd3a5 WatchSource:0}: Error finding container 6544f5f4e8b725fecaebcfead9d41d5c254c92a3986068e4914e495facebd3a5: Status 404 returned error can't find the container with id 6544f5f4e8b725fecaebcfead9d41d5c254c92a3986068e4914e495facebd3a5 Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.641537 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cc34-account-create-update-kcgjs"] Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.648866 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-efc6-account-create-update-79h54"] Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.788464 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-4wvld"] Mar 08 05:46:50 crc 
kubenswrapper[4717]: I0308 05:46:50.791608 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-4wvld" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.798947 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-ee16-account-create-update-nz4f6"] Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.800134 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-ee16-account-create-update-nz4f6" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.805169 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.817922 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-4wvld"] Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.845469 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-ee16-account-create-update-nz4f6"] Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.940079 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcsjg\" (UniqueName: \"kubernetes.io/projected/e3322e2d-06fd-4728-b09c-e03982c5afc0-kube-api-access-xcsjg\") pod \"watcher-db-create-4wvld\" (UID: \"e3322e2d-06fd-4728-b09c-e03982c5afc0\") " pod="openstack/watcher-db-create-4wvld" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.940142 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3322e2d-06fd-4728-b09c-e03982c5afc0-operator-scripts\") pod \"watcher-db-create-4wvld\" (UID: \"e3322e2d-06fd-4728-b09c-e03982c5afc0\") " pod="openstack/watcher-db-create-4wvld" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.941371 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xkb4h\" (UniqueName: \"kubernetes.io/projected/bfe0279d-2b57-4bf8-bf59-57801e535420-kube-api-access-xkb4h\") pod \"watcher-ee16-account-create-update-nz4f6\" (UID: \"bfe0279d-2b57-4bf8-bf59-57801e535420\") " pod="openstack/watcher-ee16-account-create-update-nz4f6" Mar 08 05:46:50 crc kubenswrapper[4717]: I0308 05:46:50.941479 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe0279d-2b57-4bf8-bf59-57801e535420-operator-scripts\") pod \"watcher-ee16-account-create-update-nz4f6\" (UID: \"bfe0279d-2b57-4bf8-bf59-57801e535420\") " pod="openstack/watcher-ee16-account-create-update-nz4f6" Mar 08 05:46:51 crc kubenswrapper[4717]: I0308 05:46:51.043184 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcsjg\" (UniqueName: \"kubernetes.io/projected/e3322e2d-06fd-4728-b09c-e03982c5afc0-kube-api-access-xcsjg\") pod \"watcher-db-create-4wvld\" (UID: \"e3322e2d-06fd-4728-b09c-e03982c5afc0\") " pod="openstack/watcher-db-create-4wvld" Mar 08 05:46:51 crc kubenswrapper[4717]: I0308 05:46:51.043238 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3322e2d-06fd-4728-b09c-e03982c5afc0-operator-scripts\") pod \"watcher-db-create-4wvld\" (UID: \"e3322e2d-06fd-4728-b09c-e03982c5afc0\") " pod="openstack/watcher-db-create-4wvld" Mar 08 05:46:51 crc kubenswrapper[4717]: I0308 05:46:51.043277 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkb4h\" (UniqueName: \"kubernetes.io/projected/bfe0279d-2b57-4bf8-bf59-57801e535420-kube-api-access-xkb4h\") pod \"watcher-ee16-account-create-update-nz4f6\" (UID: \"bfe0279d-2b57-4bf8-bf59-57801e535420\") " pod="openstack/watcher-ee16-account-create-update-nz4f6" Mar 08 05:46:51 crc kubenswrapper[4717]: I0308 05:46:51.043318 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe0279d-2b57-4bf8-bf59-57801e535420-operator-scripts\") pod \"watcher-ee16-account-create-update-nz4f6\" (UID: \"bfe0279d-2b57-4bf8-bf59-57801e535420\") " pod="openstack/watcher-ee16-account-create-update-nz4f6" Mar 08 05:46:51 crc kubenswrapper[4717]: I0308 05:46:51.044069 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe0279d-2b57-4bf8-bf59-57801e535420-operator-scripts\") pod \"watcher-ee16-account-create-update-nz4f6\" (UID: \"bfe0279d-2b57-4bf8-bf59-57801e535420\") " pod="openstack/watcher-ee16-account-create-update-nz4f6" Mar 08 05:46:51 crc kubenswrapper[4717]: I0308 05:46:51.044772 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3322e2d-06fd-4728-b09c-e03982c5afc0-operator-scripts\") pod \"watcher-db-create-4wvld\" (UID: \"e3322e2d-06fd-4728-b09c-e03982c5afc0\") " pod="openstack/watcher-db-create-4wvld" Mar 08 05:46:51 crc kubenswrapper[4717]: I0308 05:46:51.061251 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcsjg\" (UniqueName: \"kubernetes.io/projected/e3322e2d-06fd-4728-b09c-e03982c5afc0-kube-api-access-xcsjg\") pod \"watcher-db-create-4wvld\" (UID: \"e3322e2d-06fd-4728-b09c-e03982c5afc0\") " pod="openstack/watcher-db-create-4wvld" Mar 08 05:46:51 crc kubenswrapper[4717]: I0308 05:46:51.063156 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkb4h\" (UniqueName: \"kubernetes.io/projected/bfe0279d-2b57-4bf8-bf59-57801e535420-kube-api-access-xkb4h\") pod \"watcher-ee16-account-create-update-nz4f6\" (UID: \"bfe0279d-2b57-4bf8-bf59-57801e535420\") " pod="openstack/watcher-ee16-account-create-update-nz4f6" Mar 08 05:46:51 crc kubenswrapper[4717]: I0308 05:46:51.122458 
4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-4wvld" Mar 08 05:46:51 crc kubenswrapper[4717]: I0308 05:46:51.131881 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-ee16-account-create-update-nz4f6" Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:51.304940 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d445e3b5-cc85-45e1-bcf7-64090947ac5b","Type":"ContainerStarted","Data":"cfb8ef99e074569a08fab4e51da9e9a93487b4e588524137310649d1ea370f17"} Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:51.308188 4717 generic.go:334] "Generic (PLEG): container finished" podID="23475f99-47f9-4533-9a3d-c8a024f6bfdb" containerID="255cfec37cbcfaa183ff2967bc16799020f26ac8d36e31e90a93cf71423f99a0" exitCode=0 Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:51.308273 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z92m6" event={"ID":"23475f99-47f9-4533-9a3d-c8a024f6bfdb","Type":"ContainerDied","Data":"255cfec37cbcfaa183ff2967bc16799020f26ac8d36e31e90a93cf71423f99a0"} Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:51.308330 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z92m6" event={"ID":"23475f99-47f9-4533-9a3d-c8a024f6bfdb","Type":"ContainerStarted","Data":"6544f5f4e8b725fecaebcfead9d41d5c254c92a3986068e4914e495facebd3a5"} Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:51.310651 4717 generic.go:334] "Generic (PLEG): container finished" podID="9e74691e-98a9-4c29-9c1f-8ce4e5788f35" containerID="1c1c7255770cb151222afda0c8d5fe610519c563259e929df10847e4916c8c39" exitCode=0 Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:51.310706 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cc34-account-create-update-kcgjs" 
event={"ID":"9e74691e-98a9-4c29-9c1f-8ce4e5788f35","Type":"ContainerDied","Data":"1c1c7255770cb151222afda0c8d5fe610519c563259e929df10847e4916c8c39"} Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:51.310736 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cc34-account-create-update-kcgjs" event={"ID":"9e74691e-98a9-4c29-9c1f-8ce4e5788f35","Type":"ContainerStarted","Data":"bba4a518a701ef19fd9cfd4c063cf926c755e79003f6430d856b0a179ba8988f"} Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:51.318101 4717 generic.go:334] "Generic (PLEG): container finished" podID="22ef4c0a-9284-45bb-9319-21b75e4e1327" containerID="a408b0cf1826d25536b15f2d3c86a71de3d0646fc1cee3d534dab3694024215a" exitCode=0 Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:51.319067 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-efc6-account-create-update-79h54" event={"ID":"22ef4c0a-9284-45bb-9319-21b75e4e1327","Type":"ContainerDied","Data":"a408b0cf1826d25536b15f2d3c86a71de3d0646fc1cee3d534dab3694024215a"} Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:51.319090 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-efc6-account-create-update-79h54" event={"ID":"22ef4c0a-9284-45bb-9319-21b75e4e1327","Type":"ContainerStarted","Data":"4ebdc27c7bcd1d1a368dd8194de2134425e318d4a28abce80d31ba3d44f91da2"} Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.426437 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-4wvld"] Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.575502 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fpb8l" Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.609929 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-ee16-account-create-update-nz4f6"] Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.682869 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khsjw\" (UniqueName: \"kubernetes.io/projected/47425ac9-00c9-48b9-bd55-ef9b0d921491-kube-api-access-khsjw\") pod \"47425ac9-00c9-48b9-bd55-ef9b0d921491\" (UID: \"47425ac9-00c9-48b9-bd55-ef9b0d921491\") " Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.682961 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47425ac9-00c9-48b9-bd55-ef9b0d921491-operator-scripts\") pod \"47425ac9-00c9-48b9-bd55-ef9b0d921491\" (UID: \"47425ac9-00c9-48b9-bd55-ef9b0d921491\") " Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.683811 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47425ac9-00c9-48b9-bd55-ef9b0d921491-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47425ac9-00c9-48b9-bd55-ef9b0d921491" (UID: "47425ac9-00c9-48b9-bd55-ef9b0d921491"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.689516 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47425ac9-00c9-48b9-bd55-ef9b0d921491-kube-api-access-khsjw" (OuterVolumeSpecName: "kube-api-access-khsjw") pod "47425ac9-00c9-48b9-bd55-ef9b0d921491" (UID: "47425ac9-00c9-48b9-bd55-ef9b0d921491"). InnerVolumeSpecName "kube-api-access-khsjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.785297 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khsjw\" (UniqueName: \"kubernetes.io/projected/47425ac9-00c9-48b9-bd55-ef9b0d921491-kube-api-access-khsjw\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.785330 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47425ac9-00c9-48b9-bd55-ef9b0d921491-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.889883 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z92m6" Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.970998 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cc34-account-create-update-kcgjs" Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.980895 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-efc6-account-create-update-79h54" Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.989781 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4dwb\" (UniqueName: \"kubernetes.io/projected/23475f99-47f9-4533-9a3d-c8a024f6bfdb-kube-api-access-w4dwb\") pod \"23475f99-47f9-4533-9a3d-c8a024f6bfdb\" (UID: \"23475f99-47f9-4533-9a3d-c8a024f6bfdb\") " Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.989955 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23475f99-47f9-4533-9a3d-c8a024f6bfdb-operator-scripts\") pod \"23475f99-47f9-4533-9a3d-c8a024f6bfdb\" (UID: \"23475f99-47f9-4533-9a3d-c8a024f6bfdb\") " Mar 08 05:46:52 crc kubenswrapper[4717]: I0308 05:46:52.990934 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23475f99-47f9-4533-9a3d-c8a024f6bfdb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23475f99-47f9-4533-9a3d-c8a024f6bfdb" (UID: "23475f99-47f9-4533-9a3d-c8a024f6bfdb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.093136 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-operator-scripts\") pod \"9e74691e-98a9-4c29-9c1f-8ce4e5788f35\" (UID: \"9e74691e-98a9-4c29-9c1f-8ce4e5788f35\") " Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.093185 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkzsz\" (UniqueName: \"kubernetes.io/projected/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-kube-api-access-bkzsz\") pod \"9e74691e-98a9-4c29-9c1f-8ce4e5788f35\" (UID: \"9e74691e-98a9-4c29-9c1f-8ce4e5788f35\") " Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.093205 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ef4c0a-9284-45bb-9319-21b75e4e1327-operator-scripts\") pod \"22ef4c0a-9284-45bb-9319-21b75e4e1327\" (UID: \"22ef4c0a-9284-45bb-9319-21b75e4e1327\") " Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.093220 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7649g\" (UniqueName: \"kubernetes.io/projected/22ef4c0a-9284-45bb-9319-21b75e4e1327-kube-api-access-7649g\") pod \"22ef4c0a-9284-45bb-9319-21b75e4e1327\" (UID: \"22ef4c0a-9284-45bb-9319-21b75e4e1327\") " Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.093581 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23475f99-47f9-4533-9a3d-c8a024f6bfdb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.094315 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22ef4c0a-9284-45bb-9319-21b75e4e1327-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "22ef4c0a-9284-45bb-9319-21b75e4e1327" (UID: "22ef4c0a-9284-45bb-9319-21b75e4e1327"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.094346 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e74691e-98a9-4c29-9c1f-8ce4e5788f35" (UID: "9e74691e-98a9-4c29-9c1f-8ce4e5788f35"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.181089 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22ef4c0a-9284-45bb-9319-21b75e4e1327-kube-api-access-7649g" (OuterVolumeSpecName: "kube-api-access-7649g") pod "22ef4c0a-9284-45bb-9319-21b75e4e1327" (UID: "22ef4c0a-9284-45bb-9319-21b75e4e1327"). InnerVolumeSpecName "kube-api-access-7649g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.181164 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-kube-api-access-bkzsz" (OuterVolumeSpecName: "kube-api-access-bkzsz") pod "9e74691e-98a9-4c29-9c1f-8ce4e5788f35" (UID: "9e74691e-98a9-4c29-9c1f-8ce4e5788f35"). InnerVolumeSpecName "kube-api-access-bkzsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.181590 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23475f99-47f9-4533-9a3d-c8a024f6bfdb-kube-api-access-w4dwb" (OuterVolumeSpecName: "kube-api-access-w4dwb") pod "23475f99-47f9-4533-9a3d-c8a024f6bfdb" (UID: "23475f99-47f9-4533-9a3d-c8a024f6bfdb"). 
InnerVolumeSpecName "kube-api-access-w4dwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.204836 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4dwb\" (UniqueName: \"kubernetes.io/projected/23475f99-47f9-4533-9a3d-c8a024f6bfdb-kube-api-access-w4dwb\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.205188 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.205272 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkzsz\" (UniqueName: \"kubernetes.io/projected/9e74691e-98a9-4c29-9c1f-8ce4e5788f35-kube-api-access-bkzsz\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.205461 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ef4c0a-9284-45bb-9319-21b75e4e1327-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.205538 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7649g\" (UniqueName: \"kubernetes.io/projected/22ef4c0a-9284-45bb-9319-21b75e4e1327-kube-api-access-7649g\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.335199 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-ee16-account-create-update-nz4f6" event={"ID":"bfe0279d-2b57-4bf8-bf59-57801e535420","Type":"ContainerStarted","Data":"ec91b22d592cf245308f4999c233f8e985fcb6a4481a7aa80b7ca073e6e9c605"} Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.335238 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-ee16-account-create-update-nz4f6" 
event={"ID":"bfe0279d-2b57-4bf8-bf59-57801e535420","Type":"ContainerStarted","Data":"807e13c7988cee02b367be1ac5e5e8d0c5830352b86b049bd7bc552bcb61115c"} Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.339805 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-4wvld" event={"ID":"e3322e2d-06fd-4728-b09c-e03982c5afc0","Type":"ContainerStarted","Data":"88a5f3e61553288b62237bd90b5208a54c45283b46c95004d70bece082206615"} Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.339848 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-4wvld" event={"ID":"e3322e2d-06fd-4728-b09c-e03982c5afc0","Type":"ContainerStarted","Data":"229e2d3834b329b95ef8a754bd111302b794e695825186ceb2bd53c672465c6f"} Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.343778 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z92m6" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.343800 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z92m6" event={"ID":"23475f99-47f9-4533-9a3d-c8a024f6bfdb","Type":"ContainerDied","Data":"6544f5f4e8b725fecaebcfead9d41d5c254c92a3986068e4914e495facebd3a5"} Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.343849 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6544f5f4e8b725fecaebcfead9d41d5c254c92a3986068e4914e495facebd3a5" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.348580 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cc34-account-create-update-kcgjs" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.348664 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cc34-account-create-update-kcgjs" event={"ID":"9e74691e-98a9-4c29-9c1f-8ce4e5788f35","Type":"ContainerDied","Data":"bba4a518a701ef19fd9cfd4c063cf926c755e79003f6430d856b0a179ba8988f"} Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.348720 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba4a518a701ef19fd9cfd4c063cf926c755e79003f6430d856b0a179ba8988f" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.355770 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-ee16-account-create-update-nz4f6" podStartSLOduration=3.355753183 podStartE2EDuration="3.355753183s" podCreationTimestamp="2026-03-08 05:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:46:53.353974909 +0000 UTC m=+1240.271623753" watchObservedRunningTime="2026-03-08 05:46:53.355753183 +0000 UTC m=+1240.273402027" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.388276 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-4wvld" podStartSLOduration=3.388258533 podStartE2EDuration="3.388258533s" podCreationTimestamp="2026-03-08 05:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:46:53.387161386 +0000 UTC m=+1240.304810240" watchObservedRunningTime="2026-03-08 05:46:53.388258533 +0000 UTC m=+1240.305907377" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.401333 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-efc6-account-create-update-79h54" 
event={"ID":"22ef4c0a-9284-45bb-9319-21b75e4e1327","Type":"ContainerDied","Data":"4ebdc27c7bcd1d1a368dd8194de2134425e318d4a28abce80d31ba3d44f91da2"} Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.401423 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ebdc27c7bcd1d1a368dd8194de2134425e318d4a28abce80d31ba3d44f91da2" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.401377 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-efc6-account-create-update-79h54" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.407253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fpb8l" event={"ID":"47425ac9-00c9-48b9-bd55-ef9b0d921491","Type":"ContainerDied","Data":"8adce2d050b1930b5e47ec7708bccc61be4feca95914e653b6e87f3b554e12f6"} Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.407308 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8adce2d050b1930b5e47ec7708bccc61be4feca95914e653b6e87f3b554e12f6" Mar 08 05:46:53 crc kubenswrapper[4717]: I0308 05:46:53.407428 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fpb8l" Mar 08 05:46:54 crc kubenswrapper[4717]: I0308 05:46:54.424066 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d445e3b5-cc85-45e1-bcf7-64090947ac5b","Type":"ContainerStarted","Data":"da9e55b4b4db08e5866e96ab67da58b228ca3ddb7a01e67275e972c7d3dbf661"} Mar 08 05:46:54 crc kubenswrapper[4717]: I0308 05:46:54.428869 4717 generic.go:334] "Generic (PLEG): container finished" podID="e3322e2d-06fd-4728-b09c-e03982c5afc0" containerID="88a5f3e61553288b62237bd90b5208a54c45283b46c95004d70bece082206615" exitCode=0 Mar 08 05:46:54 crc kubenswrapper[4717]: I0308 05:46:54.428893 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-4wvld" event={"ID":"e3322e2d-06fd-4728-b09c-e03982c5afc0","Type":"ContainerDied","Data":"88a5f3e61553288b62237bd90b5208a54c45283b46c95004d70bece082206615"} Mar 08 05:46:54 crc kubenswrapper[4717]: I0308 05:46:54.431474 4717 generic.go:334] "Generic (PLEG): container finished" podID="bfe0279d-2b57-4bf8-bf59-57801e535420" containerID="ec91b22d592cf245308f4999c233f8e985fcb6a4481a7aa80b7ca073e6e9c605" exitCode=0 Mar 08 05:46:54 crc kubenswrapper[4717]: I0308 05:46:54.431518 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-ee16-account-create-update-nz4f6" event={"ID":"bfe0279d-2b57-4bf8-bf59-57801e535420","Type":"ContainerDied","Data":"ec91b22d592cf245308f4999c233f8e985fcb6a4481a7aa80b7ca073e6e9c605"} Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.273117 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fpb8l"] Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.290925 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fpb8l"] Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.350557 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/root-account-create-update-4zg55"] Mar 08 05:46:55 crc kubenswrapper[4717]: E0308 05:46:55.350995 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ef4c0a-9284-45bb-9319-21b75e4e1327" containerName="mariadb-account-create-update" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.351021 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ef4c0a-9284-45bb-9319-21b75e4e1327" containerName="mariadb-account-create-update" Mar 08 05:46:55 crc kubenswrapper[4717]: E0308 05:46:55.351042 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47425ac9-00c9-48b9-bd55-ef9b0d921491" containerName="mariadb-account-create-update" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.351052 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="47425ac9-00c9-48b9-bd55-ef9b0d921491" containerName="mariadb-account-create-update" Mar 08 05:46:55 crc kubenswrapper[4717]: E0308 05:46:55.351070 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e74691e-98a9-4c29-9c1f-8ce4e5788f35" containerName="mariadb-account-create-update" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.351078 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e74691e-98a9-4c29-9c1f-8ce4e5788f35" containerName="mariadb-account-create-update" Mar 08 05:46:55 crc kubenswrapper[4717]: E0308 05:46:55.351095 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23475f99-47f9-4533-9a3d-c8a024f6bfdb" containerName="mariadb-database-create" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.351103 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="23475f99-47f9-4533-9a3d-c8a024f6bfdb" containerName="mariadb-database-create" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.351307 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="23475f99-47f9-4533-9a3d-c8a024f6bfdb" containerName="mariadb-database-create" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 
05:46:55.351327 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e74691e-98a9-4c29-9c1f-8ce4e5788f35" containerName="mariadb-account-create-update" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.351337 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="22ef4c0a-9284-45bb-9319-21b75e4e1327" containerName="mariadb-account-create-update" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.351359 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="47425ac9-00c9-48b9-bd55-ef9b0d921491" containerName="mariadb-account-create-update" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.351980 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4zg55" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.354324 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.369827 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4zg55"] Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.447931 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1355028-71db-4c23-8219-4401a80e2902-operator-scripts\") pod \"root-account-create-update-4zg55\" (UID: \"e1355028-71db-4c23-8219-4401a80e2902\") " pod="openstack/root-account-create-update-4zg55" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.448568 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t7h2\" (UniqueName: \"kubernetes.io/projected/e1355028-71db-4c23-8219-4401a80e2902-kube-api-access-2t7h2\") pod \"root-account-create-update-4zg55\" (UID: \"e1355028-71db-4c23-8219-4401a80e2902\") " pod="openstack/root-account-create-update-4zg55" Mar 08 
05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.551849 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1355028-71db-4c23-8219-4401a80e2902-operator-scripts\") pod \"root-account-create-update-4zg55\" (UID: \"e1355028-71db-4c23-8219-4401a80e2902\") " pod="openstack/root-account-create-update-4zg55" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.552802 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1355028-71db-4c23-8219-4401a80e2902-operator-scripts\") pod \"root-account-create-update-4zg55\" (UID: \"e1355028-71db-4c23-8219-4401a80e2902\") " pod="openstack/root-account-create-update-4zg55" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.553254 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t7h2\" (UniqueName: \"kubernetes.io/projected/e1355028-71db-4c23-8219-4401a80e2902-kube-api-access-2t7h2\") pod \"root-account-create-update-4zg55\" (UID: \"e1355028-71db-4c23-8219-4401a80e2902\") " pod="openstack/root-account-create-update-4zg55" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.602034 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t7h2\" (UniqueName: \"kubernetes.io/projected/e1355028-71db-4c23-8219-4401a80e2902-kube-api-access-2t7h2\") pod \"root-account-create-update-4zg55\" (UID: \"e1355028-71db-4c23-8219-4401a80e2902\") " pod="openstack/root-account-create-update-4zg55" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.690199 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4zg55" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.809730 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47425ac9-00c9-48b9-bd55-ef9b0d921491" path="/var/lib/kubelet/pods/47425ac9-00c9-48b9-bd55-ef9b0d921491/volumes" Mar 08 05:46:55 crc kubenswrapper[4717]: I0308 05:46:55.946741 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-4wvld" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.018012 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-ee16-account-create-update-nz4f6" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.065256 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3322e2d-06fd-4728-b09c-e03982c5afc0-operator-scripts\") pod \"e3322e2d-06fd-4728-b09c-e03982c5afc0\" (UID: \"e3322e2d-06fd-4728-b09c-e03982c5afc0\") " Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.065391 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcsjg\" (UniqueName: \"kubernetes.io/projected/e3322e2d-06fd-4728-b09c-e03982c5afc0-kube-api-access-xcsjg\") pod \"e3322e2d-06fd-4728-b09c-e03982c5afc0\" (UID: \"e3322e2d-06fd-4728-b09c-e03982c5afc0\") " Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.066175 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3322e2d-06fd-4728-b09c-e03982c5afc0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3322e2d-06fd-4728-b09c-e03982c5afc0" (UID: "e3322e2d-06fd-4728-b09c-e03982c5afc0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.068293 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3322e2d-06fd-4728-b09c-e03982c5afc0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.076944 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3322e2d-06fd-4728-b09c-e03982c5afc0-kube-api-access-xcsjg" (OuterVolumeSpecName: "kube-api-access-xcsjg") pod "e3322e2d-06fd-4728-b09c-e03982c5afc0" (UID: "e3322e2d-06fd-4728-b09c-e03982c5afc0"). InnerVolumeSpecName "kube-api-access-xcsjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.169924 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkb4h\" (UniqueName: \"kubernetes.io/projected/bfe0279d-2b57-4bf8-bf59-57801e535420-kube-api-access-xkb4h\") pod \"bfe0279d-2b57-4bf8-bf59-57801e535420\" (UID: \"bfe0279d-2b57-4bf8-bf59-57801e535420\") " Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.169967 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe0279d-2b57-4bf8-bf59-57801e535420-operator-scripts\") pod \"bfe0279d-2b57-4bf8-bf59-57801e535420\" (UID: \"bfe0279d-2b57-4bf8-bf59-57801e535420\") " Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.170271 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcsjg\" (UniqueName: \"kubernetes.io/projected/e3322e2d-06fd-4728-b09c-e03982c5afc0-kube-api-access-xcsjg\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.170623 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/bfe0279d-2b57-4bf8-bf59-57801e535420-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bfe0279d-2b57-4bf8-bf59-57801e535420" (UID: "bfe0279d-2b57-4bf8-bf59-57801e535420"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.173176 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe0279d-2b57-4bf8-bf59-57801e535420-kube-api-access-xkb4h" (OuterVolumeSpecName: "kube-api-access-xkb4h") pod "bfe0279d-2b57-4bf8-bf59-57801e535420" (UID: "bfe0279d-2b57-4bf8-bf59-57801e535420"). InnerVolumeSpecName "kube-api-access-xkb4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.192156 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.239845 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc"] Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.240057 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" podUID="22841d35-043a-442d-beca-decc3c750d66" containerName="dnsmasq-dns" containerID="cri-o://b94ec05f86cc041aa2434a2cf16033f0b4de1619b5baae7bbe9757bc75f8430a" gracePeriod=10 Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.271544 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkb4h\" (UniqueName: \"kubernetes.io/projected/bfe0279d-2b57-4bf8-bf59-57801e535420-kube-api-access-xkb4h\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.271568 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe0279d-2b57-4bf8-bf59-57801e535420-operator-scripts\") on 
node \"crc\" DevicePath \"\"" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.320167 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4zg55"] Mar 08 05:46:56 crc kubenswrapper[4717]: W0308 05:46:56.412760 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1355028_71db_4c23_8219_4401a80e2902.slice/crio-8fa9f1962eeeb55443c1a5c8155b2a608df5f85febf78b03108c46c1d643a0d1 WatchSource:0}: Error finding container 8fa9f1962eeeb55443c1a5c8155b2a608df5f85febf78b03108c46c1d643a0d1: Status 404 returned error can't find the container with id 8fa9f1962eeeb55443c1a5c8155b2a608df5f85febf78b03108c46c1d643a0d1 Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.453150 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-ee16-account-create-update-nz4f6" event={"ID":"bfe0279d-2b57-4bf8-bf59-57801e535420","Type":"ContainerDied","Data":"807e13c7988cee02b367be1ac5e5e8d0c5830352b86b049bd7bc552bcb61115c"} Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.453197 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="807e13c7988cee02b367be1ac5e5e8d0c5830352b86b049bd7bc552bcb61115c" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.453265 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-ee16-account-create-update-nz4f6" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.467574 4717 generic.go:334] "Generic (PLEG): container finished" podID="03d2941c-7434-4961-a7ea-fdff878a1128" containerID="81f8dddefadf9338833053ed46e325f5c3790426f6c550ff9b0ae964202eb45d" exitCode=0 Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.467744 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wc6z2" event={"ID":"03d2941c-7434-4961-a7ea-fdff878a1128","Type":"ContainerDied","Data":"81f8dddefadf9338833053ed46e325f5c3790426f6c550ff9b0ae964202eb45d"} Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.470244 4717 generic.go:334] "Generic (PLEG): container finished" podID="22841d35-043a-442d-beca-decc3c750d66" containerID="b94ec05f86cc041aa2434a2cf16033f0b4de1619b5baae7bbe9757bc75f8430a" exitCode=0 Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.470316 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" event={"ID":"22841d35-043a-442d-beca-decc3c750d66","Type":"ContainerDied","Data":"b94ec05f86cc041aa2434a2cf16033f0b4de1619b5baae7bbe9757bc75f8430a"} Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.474144 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4zg55" event={"ID":"e1355028-71db-4c23-8219-4401a80e2902","Type":"ContainerStarted","Data":"8fa9f1962eeeb55443c1a5c8155b2a608df5f85febf78b03108c46c1d643a0d1"} Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.475402 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-4wvld" event={"ID":"e3322e2d-06fd-4728-b09c-e03982c5afc0","Type":"ContainerDied","Data":"229e2d3834b329b95ef8a754bd111302b794e695825186ceb2bd53c672465c6f"} Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.475425 4717 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="229e2d3834b329b95ef8a754bd111302b794e695825186ceb2bd53c672465c6f" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.475475 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-4wvld" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.768884 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.880648 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqmsb\" (UniqueName: \"kubernetes.io/projected/22841d35-043a-442d-beca-decc3c750d66-kube-api-access-fqmsb\") pod \"22841d35-043a-442d-beca-decc3c750d66\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.880731 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-sb\") pod \"22841d35-043a-442d-beca-decc3c750d66\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.880786 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-config\") pod \"22841d35-043a-442d-beca-decc3c750d66\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.880868 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-dns-svc\") pod \"22841d35-043a-442d-beca-decc3c750d66\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.880932 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-nb\") pod \"22841d35-043a-442d-beca-decc3c750d66\" (UID: \"22841d35-043a-442d-beca-decc3c750d66\") " Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.888186 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22841d35-043a-442d-beca-decc3c750d66-kube-api-access-fqmsb" (OuterVolumeSpecName: "kube-api-access-fqmsb") pod "22841d35-043a-442d-beca-decc3c750d66" (UID: "22841d35-043a-442d-beca-decc3c750d66"). InnerVolumeSpecName "kube-api-access-fqmsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.924158 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-config" (OuterVolumeSpecName: "config") pod "22841d35-043a-442d-beca-decc3c750d66" (UID: "22841d35-043a-442d-beca-decc3c750d66"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.924759 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22841d35-043a-442d-beca-decc3c750d66" (UID: "22841d35-043a-442d-beca-decc3c750d66"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.937119 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22841d35-043a-442d-beca-decc3c750d66" (UID: "22841d35-043a-442d-beca-decc3c750d66"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.938378 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22841d35-043a-442d-beca-decc3c750d66" (UID: "22841d35-043a-442d-beca-decc3c750d66"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.982912 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.982952 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqmsb\" (UniqueName: \"kubernetes.io/projected/22841d35-043a-442d-beca-decc3c750d66-kube-api-access-fqmsb\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.982966 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.982979 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:56 crc kubenswrapper[4717]: I0308 05:46:56.982991 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22841d35-043a-442d-beca-decc3c750d66-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.487043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" 
event={"ID":"22841d35-043a-442d-beca-decc3c750d66","Type":"ContainerDied","Data":"6a1aacfd359257a40e1f11e9ac5f1cdb63497b07b873f898ff8b4f0915e0d139"} Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.487098 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc" Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.487124 4717 scope.go:117] "RemoveContainer" containerID="b94ec05f86cc041aa2434a2cf16033f0b4de1619b5baae7bbe9757bc75f8430a" Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.491553 4717 generic.go:334] "Generic (PLEG): container finished" podID="e1355028-71db-4c23-8219-4401a80e2902" containerID="b6b97e7aa1da0b0cd74f5b9174ef3ad4c02d3e79f5545e8f1424a57c60a08ac9" exitCode=0 Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.491590 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4zg55" event={"ID":"e1355028-71db-4c23-8219-4401a80e2902","Type":"ContainerDied","Data":"b6b97e7aa1da0b0cd74f5b9174ef3ad4c02d3e79f5545e8f1424a57c60a08ac9"} Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.529848 4717 scope.go:117] "RemoveContainer" containerID="63b10a21e9f5b3ee560412875559e6b2d813d1c6fcd3c3d7b5f67eacb171079e" Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.561944 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc"] Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.569269 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7dc8b7f6fc-2jcgc"] Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.791258 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22841d35-043a-442d-beca-decc3c750d66" path="/var/lib/kubelet/pods/22841d35-043a-442d-beca-decc3c750d66/volumes" Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.844890 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wc6z2" Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.897754 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0" Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.902457 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67a11de8-b5e8-40d8-a451-1bece45918d8-etc-swift\") pod \"swift-storage-0\" (UID: \"67a11de8-b5e8-40d8-a451-1bece45918d8\") " pod="openstack/swift-storage-0" Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.999262 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-ring-data-devices\") pod \"03d2941c-7434-4961-a7ea-fdff878a1128\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.999363 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-dispersionconf\") pod \"03d2941c-7434-4961-a7ea-fdff878a1128\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.999394 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6qtd\" (UniqueName: \"kubernetes.io/projected/03d2941c-7434-4961-a7ea-fdff878a1128-kube-api-access-z6qtd\") pod \"03d2941c-7434-4961-a7ea-fdff878a1128\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.999435 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-swiftconf\") pod \"03d2941c-7434-4961-a7ea-fdff878a1128\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.999519 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-scripts\") pod \"03d2941c-7434-4961-a7ea-fdff878a1128\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.999557 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03d2941c-7434-4961-a7ea-fdff878a1128-etc-swift\") pod \"03d2941c-7434-4961-a7ea-fdff878a1128\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " Mar 08 05:46:57 crc kubenswrapper[4717]: I0308 05:46:57.999579 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-combined-ca-bundle\") pod \"03d2941c-7434-4961-a7ea-fdff878a1128\" (UID: \"03d2941c-7434-4961-a7ea-fdff878a1128\") " Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.000052 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "03d2941c-7434-4961-a7ea-fdff878a1128" (UID: "03d2941c-7434-4961-a7ea-fdff878a1128"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.000528 4717 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.001061 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d2941c-7434-4961-a7ea-fdff878a1128-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "03d2941c-7434-4961-a7ea-fdff878a1128" (UID: "03d2941c-7434-4961-a7ea-fdff878a1128"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.004827 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d2941c-7434-4961-a7ea-fdff878a1128-kube-api-access-z6qtd" (OuterVolumeSpecName: "kube-api-access-z6qtd") pod "03d2941c-7434-4961-a7ea-fdff878a1128" (UID: "03d2941c-7434-4961-a7ea-fdff878a1128"). InnerVolumeSpecName "kube-api-access-z6qtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.008793 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "03d2941c-7434-4961-a7ea-fdff878a1128" (UID: "03d2941c-7434-4961-a7ea-fdff878a1128"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.018625 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-scripts" (OuterVolumeSpecName: "scripts") pod "03d2941c-7434-4961-a7ea-fdff878a1128" (UID: "03d2941c-7434-4961-a7ea-fdff878a1128"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.038066 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03d2941c-7434-4961-a7ea-fdff878a1128" (UID: "03d2941c-7434-4961-a7ea-fdff878a1128"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.042842 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "03d2941c-7434-4961-a7ea-fdff878a1128" (UID: "03d2941c-7434-4961-a7ea-fdff878a1128"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.102819 4717 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.102872 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6qtd\" (UniqueName: \"kubernetes.io/projected/03d2941c-7434-4961-a7ea-fdff878a1128-kube-api-access-z6qtd\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.102894 4717 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.102913 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03d2941c-7434-4961-a7ea-fdff878a1128-scripts\") on node \"crc\" 
DevicePath \"\"" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.102931 4717 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/03d2941c-7434-4961-a7ea-fdff878a1128-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.102949 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d2941c-7434-4961-a7ea-fdff878a1128-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.179148 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.502295 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wc6z2" event={"ID":"03d2941c-7434-4961-a7ea-fdff878a1128","Type":"ContainerDied","Data":"2711bab0d4969ef6a9200460d2834af9c2e19b5b6f299b298402190a716bf0c3"} Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.502342 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2711bab0d4969ef6a9200460d2834af9c2e19b5b6f299b298402190a716bf0c3" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.502361 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wc6z2" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.568980 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-t4wh5"] Mar 08 05:46:58 crc kubenswrapper[4717]: E0308 05:46:58.569375 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d2941c-7434-4961-a7ea-fdff878a1128" containerName="swift-ring-rebalance" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.569393 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d2941c-7434-4961-a7ea-fdff878a1128" containerName="swift-ring-rebalance" Mar 08 05:46:58 crc kubenswrapper[4717]: E0308 05:46:58.569411 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22841d35-043a-442d-beca-decc3c750d66" containerName="dnsmasq-dns" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.569419 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="22841d35-043a-442d-beca-decc3c750d66" containerName="dnsmasq-dns" Mar 08 05:46:58 crc kubenswrapper[4717]: E0308 05:46:58.569438 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3322e2d-06fd-4728-b09c-e03982c5afc0" containerName="mariadb-database-create" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.569447 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3322e2d-06fd-4728-b09c-e03982c5afc0" containerName="mariadb-database-create" Mar 08 05:46:58 crc kubenswrapper[4717]: E0308 05:46:58.569457 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22841d35-043a-442d-beca-decc3c750d66" containerName="init" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.569465 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="22841d35-043a-442d-beca-decc3c750d66" containerName="init" Mar 08 05:46:58 crc kubenswrapper[4717]: E0308 05:46:58.569488 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe0279d-2b57-4bf8-bf59-57801e535420" 
containerName="mariadb-account-create-update" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.569496 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe0279d-2b57-4bf8-bf59-57801e535420" containerName="mariadb-account-create-update" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.571062 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="22841d35-043a-442d-beca-decc3c750d66" containerName="dnsmasq-dns" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.571085 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe0279d-2b57-4bf8-bf59-57801e535420" containerName="mariadb-account-create-update" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.571110 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d2941c-7434-4961-a7ea-fdff878a1128" containerName="swift-ring-rebalance" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.571123 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3322e2d-06fd-4728-b09c-e03982c5afc0" containerName="mariadb-database-create" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.571642 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-t4wh5" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.577417 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-t4wh5"] Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.699965 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9f40-account-create-update-bhbfw"] Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.701733 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9f40-account-create-update-bhbfw" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.704931 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.720106 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx5wn\" (UniqueName: \"kubernetes.io/projected/c576697c-a1be-47aa-bf8f-902435b2af04-kube-api-access-cx5wn\") pod \"glance-db-create-t4wh5\" (UID: \"c576697c-a1be-47aa-bf8f-902435b2af04\") " pod="openstack/glance-db-create-t4wh5" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.720224 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c576697c-a1be-47aa-bf8f-902435b2af04-operator-scripts\") pod \"glance-db-create-t4wh5\" (UID: \"c576697c-a1be-47aa-bf8f-902435b2af04\") " pod="openstack/glance-db-create-t4wh5" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.741225 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9f40-account-create-update-bhbfw"] Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.821553 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx5wn\" (UniqueName: \"kubernetes.io/projected/c576697c-a1be-47aa-bf8f-902435b2af04-kube-api-access-cx5wn\") pod \"glance-db-create-t4wh5\" (UID: \"c576697c-a1be-47aa-bf8f-902435b2af04\") " pod="openstack/glance-db-create-t4wh5" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.821625 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c576697c-a1be-47aa-bf8f-902435b2af04-operator-scripts\") pod \"glance-db-create-t4wh5\" (UID: \"c576697c-a1be-47aa-bf8f-902435b2af04\") " pod="openstack/glance-db-create-t4wh5" 
Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.821713 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a07c895-406b-426a-b60a-545e0dced812-operator-scripts\") pod \"glance-9f40-account-create-update-bhbfw\" (UID: \"2a07c895-406b-426a-b60a-545e0dced812\") " pod="openstack/glance-9f40-account-create-update-bhbfw" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.821769 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4fq9\" (UniqueName: \"kubernetes.io/projected/2a07c895-406b-426a-b60a-545e0dced812-kube-api-access-k4fq9\") pod \"glance-9f40-account-create-update-bhbfw\" (UID: \"2a07c895-406b-426a-b60a-545e0dced812\") " pod="openstack/glance-9f40-account-create-update-bhbfw" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.822960 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c576697c-a1be-47aa-bf8f-902435b2af04-operator-scripts\") pod \"glance-db-create-t4wh5\" (UID: \"c576697c-a1be-47aa-bf8f-902435b2af04\") " pod="openstack/glance-db-create-t4wh5" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.866947 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx5wn\" (UniqueName: \"kubernetes.io/projected/c576697c-a1be-47aa-bf8f-902435b2af04-kube-api-access-cx5wn\") pod \"glance-db-create-t4wh5\" (UID: \"c576697c-a1be-47aa-bf8f-902435b2af04\") " pod="openstack/glance-db-create-t4wh5" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.923800 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a07c895-406b-426a-b60a-545e0dced812-operator-scripts\") pod \"glance-9f40-account-create-update-bhbfw\" (UID: \"2a07c895-406b-426a-b60a-545e0dced812\") " 
pod="openstack/glance-9f40-account-create-update-bhbfw" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.923853 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4fq9\" (UniqueName: \"kubernetes.io/projected/2a07c895-406b-426a-b60a-545e0dced812-kube-api-access-k4fq9\") pod \"glance-9f40-account-create-update-bhbfw\" (UID: \"2a07c895-406b-426a-b60a-545e0dced812\") " pod="openstack/glance-9f40-account-create-update-bhbfw" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.925289 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a07c895-406b-426a-b60a-545e0dced812-operator-scripts\") pod \"glance-9f40-account-create-update-bhbfw\" (UID: \"2a07c895-406b-426a-b60a-545e0dced812\") " pod="openstack/glance-9f40-account-create-update-bhbfw" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.942779 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4fq9\" (UniqueName: \"kubernetes.io/projected/2a07c895-406b-426a-b60a-545e0dced812-kube-api-access-k4fq9\") pod \"glance-9f40-account-create-update-bhbfw\" (UID: \"2a07c895-406b-426a-b60a-545e0dced812\") " pod="openstack/glance-9f40-account-create-update-bhbfw" Mar 08 05:46:58 crc kubenswrapper[4717]: I0308 05:46:58.973545 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-t4wh5" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.009569 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.032505 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:46:59 crc kubenswrapper[4717]: W0308 05:46:59.034442 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67a11de8_b5e8_40d8_a451_1bece45918d8.slice/crio-043af47b5c4d5b82d8cb60ddc0e4f93ebe0a7283891dcb7daf610abc9f039b4a WatchSource:0}: Error finding container 043af47b5c4d5b82d8cb60ddc0e4f93ebe0a7283891dcb7daf610abc9f039b4a: Status 404 returned error can't find the container with id 043af47b5c4d5b82d8cb60ddc0e4f93ebe0a7283891dcb7daf610abc9f039b4a Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.036792 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9f40-account-create-update-bhbfw" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.044320 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4c5fb" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.060646 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4zg55" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.227534 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t7h2\" (UniqueName: \"kubernetes.io/projected/e1355028-71db-4c23-8219-4401a80e2902-kube-api-access-2t7h2\") pod \"e1355028-71db-4c23-8219-4401a80e2902\" (UID: \"e1355028-71db-4c23-8219-4401a80e2902\") " Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.228139 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1355028-71db-4c23-8219-4401a80e2902-operator-scripts\") pod \"e1355028-71db-4c23-8219-4401a80e2902\" (UID: \"e1355028-71db-4c23-8219-4401a80e2902\") " Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.230667 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1355028-71db-4c23-8219-4401a80e2902-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1355028-71db-4c23-8219-4401a80e2902" (UID: "e1355028-71db-4c23-8219-4401a80e2902"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.241184 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1355028-71db-4c23-8219-4401a80e2902-kube-api-access-2t7h2" (OuterVolumeSpecName: "kube-api-access-2t7h2") pod "e1355028-71db-4c23-8219-4401a80e2902" (UID: "e1355028-71db-4c23-8219-4401a80e2902"). InnerVolumeSpecName "kube-api-access-2t7h2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.248286 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mcfsn-config-jkbl6"] Mar 08 05:46:59 crc kubenswrapper[4717]: E0308 05:46:59.248765 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1355028-71db-4c23-8219-4401a80e2902" containerName="mariadb-account-create-update" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.248789 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1355028-71db-4c23-8219-4401a80e2902" containerName="mariadb-account-create-update" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.248953 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1355028-71db-4c23-8219-4401a80e2902" containerName="mariadb-account-create-update" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.249586 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.252086 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.264011 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mcfsn-config-jkbl6"] Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.298649 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-smbl2"] Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.299841 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-smbl2" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.307567 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-smbl2"] Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.329702 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t7h2\" (UniqueName: \"kubernetes.io/projected/e1355028-71db-4c23-8219-4401a80e2902-kube-api-access-2t7h2\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.329726 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1355028-71db-4c23-8219-4401a80e2902-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.430840 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcw75\" (UniqueName: \"kubernetes.io/projected/1f49e0f8-0661-49bb-a6c0-30365a1398f6-kube-api-access-qcw75\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.430916 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-log-ovn\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.430937 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " 
pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.431036 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run-ovn\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.431064 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-scripts\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.431153 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c89br\" (UniqueName: \"kubernetes.io/projected/a6d77060-47d4-40ad-b1c1-5000a1513aaa-kube-api-access-c89br\") pod \"keystone-db-create-smbl2\" (UID: \"a6d77060-47d4-40ad-b1c1-5000a1513aaa\") " pod="openstack/keystone-db-create-smbl2" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.431227 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d77060-47d4-40ad-b1c1-5000a1513aaa-operator-scripts\") pod \"keystone-db-create-smbl2\" (UID: \"a6d77060-47d4-40ad-b1c1-5000a1513aaa\") " pod="openstack/keystone-db-create-smbl2" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.431285 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-additional-scripts\") pod 
\"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.515343 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4zg55" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.515336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4zg55" event={"ID":"e1355028-71db-4c23-8219-4401a80e2902","Type":"ContainerDied","Data":"8fa9f1962eeeb55443c1a5c8155b2a608df5f85febf78b03108c46c1d643a0d1"} Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.515455 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fa9f1962eeeb55443c1a5c8155b2a608df5f85febf78b03108c46c1d643a0d1" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.516763 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"043af47b5c4d5b82d8cb60ddc0e4f93ebe0a7283891dcb7daf610abc9f039b4a"} Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.516817 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-t4wh5"] Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.532601 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run-ovn\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.532632 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-scripts\") pod 
\"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.532654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c89br\" (UniqueName: \"kubernetes.io/projected/a6d77060-47d4-40ad-b1c1-5000a1513aaa-kube-api-access-c89br\") pod \"keystone-db-create-smbl2\" (UID: \"a6d77060-47d4-40ad-b1c1-5000a1513aaa\") " pod="openstack/keystone-db-create-smbl2" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.532705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d77060-47d4-40ad-b1c1-5000a1513aaa-operator-scripts\") pod \"keystone-db-create-smbl2\" (UID: \"a6d77060-47d4-40ad-b1c1-5000a1513aaa\") " pod="openstack/keystone-db-create-smbl2" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.532733 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-additional-scripts\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.532784 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcw75\" (UniqueName: \"kubernetes.io/projected/1f49e0f8-0661-49bb-a6c0-30365a1398f6-kube-api-access-qcw75\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.532805 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-log-ovn\") pod 
\"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.532831 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.532972 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.533134 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-log-ovn\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.533768 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d77060-47d4-40ad-b1c1-5000a1513aaa-operator-scripts\") pod \"keystone-db-create-smbl2\" (UID: \"a6d77060-47d4-40ad-b1c1-5000a1513aaa\") " pod="openstack/keystone-db-create-smbl2" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.533778 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-additional-scripts\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: 
\"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.533866 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run-ovn\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.536835 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-scripts\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.548512 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c89br\" (UniqueName: \"kubernetes.io/projected/a6d77060-47d4-40ad-b1c1-5000a1513aaa-kube-api-access-c89br\") pod \"keystone-db-create-smbl2\" (UID: \"a6d77060-47d4-40ad-b1c1-5000a1513aaa\") " pod="openstack/keystone-db-create-smbl2" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.550570 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcw75\" (UniqueName: \"kubernetes.io/projected/1f49e0f8-0661-49bb-a6c0-30365a1398f6-kube-api-access-qcw75\") pod \"ovn-controller-mcfsn-config-jkbl6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.577130 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.583421 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9f40-account-create-update-bhbfw"] Mar 08 05:46:59 crc kubenswrapper[4717]: I0308 05:46:59.627511 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-smbl2" Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.484357 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-smbl2"] Mar 08 05:47:00 crc kubenswrapper[4717]: W0308 05:47:00.486133 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6d77060_47d4_40ad_b1c1_5000a1513aaa.slice/crio-d9a9e5e46bddd2ed19e39925cf80ef73b04525bd3bc1ad7bf6b18ce5b97c1a30 WatchSource:0}: Error finding container d9a9e5e46bddd2ed19e39925cf80ef73b04525bd3bc1ad7bf6b18ce5b97c1a30: Status 404 returned error can't find the container with id d9a9e5e46bddd2ed19e39925cf80ef73b04525bd3bc1ad7bf6b18ce5b97c1a30 Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.527066 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9f40-account-create-update-bhbfw" event={"ID":"2a07c895-406b-426a-b60a-545e0dced812","Type":"ContainerStarted","Data":"008c39892611d17ac1115e163ddbdd097cf110a7d4cc4e20295f38f134bc360f"} Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.527108 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9f40-account-create-update-bhbfw" event={"ID":"2a07c895-406b-426a-b60a-545e0dced812","Type":"ContainerStarted","Data":"e178d30e806a4fb5411409aeca5ada7bfa294be75c32713e13b28f46453fd743"} Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.529431 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"3f379b1243a5b074d7cdeb3e8194a0ecd18041181ffa42abe863a3a577952ea5"} Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.530704 4717 generic.go:334] "Generic (PLEG): container finished" podID="c576697c-a1be-47aa-bf8f-902435b2af04" containerID="088d3cbf4a68a9371e3d9737a7d7a0e8d8fa4df911be2a27196944665c99232d" exitCode=0 Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.530741 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-t4wh5" event={"ID":"c576697c-a1be-47aa-bf8f-902435b2af04","Type":"ContainerDied","Data":"088d3cbf4a68a9371e3d9737a7d7a0e8d8fa4df911be2a27196944665c99232d"} Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.530756 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-t4wh5" event={"ID":"c576697c-a1be-47aa-bf8f-902435b2af04","Type":"ContainerStarted","Data":"3e97cdd25b6adc36276683aa7aa476e519e3bd86e618cffe153b06b88f7d0767"} Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.532044 4717 generic.go:334] "Generic (PLEG): container finished" podID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerID="da9e55b4b4db08e5866e96ab67da58b228ca3ddb7a01e67275e972c7d3dbf661" exitCode=0 Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.532079 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d445e3b5-cc85-45e1-bcf7-64090947ac5b","Type":"ContainerDied","Data":"da9e55b4b4db08e5866e96ab67da58b228ca3ddb7a01e67275e972c7d3dbf661"} Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.535949 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-smbl2" event={"ID":"a6d77060-47d4-40ad-b1c1-5000a1513aaa","Type":"ContainerStarted","Data":"d9a9e5e46bddd2ed19e39925cf80ef73b04525bd3bc1ad7bf6b18ce5b97c1a30"} Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.548371 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-9f40-account-create-update-bhbfw" podStartSLOduration=2.548356308 podStartE2EDuration="2.548356308s" podCreationTimestamp="2026-03-08 05:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:00.543426686 +0000 UTC m=+1247.461075530" watchObservedRunningTime="2026-03-08 05:47:00.548356308 +0000 UTC m=+1247.466005152" Mar 08 05:47:00 crc kubenswrapper[4717]: I0308 05:47:00.612508 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mcfsn-config-jkbl6"] Mar 08 05:47:00 crc kubenswrapper[4717]: W0308 05:47:00.623801 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f49e0f8_0661_49bb_a6c0_30365a1398f6.slice/crio-b7bdf57aa56c1adb95bdad2a44bef7d54e61f04c09da3df0a9e0ff3bee05ab2a WatchSource:0}: Error finding container b7bdf57aa56c1adb95bdad2a44bef7d54e61f04c09da3df0a9e0ff3bee05ab2a: Status 404 returned error can't find the container with id b7bdf57aa56c1adb95bdad2a44bef7d54e61f04c09da3df0a9e0ff3bee05ab2a Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.546758 4717 generic.go:334] "Generic (PLEG): container finished" podID="a6d77060-47d4-40ad-b1c1-5000a1513aaa" containerID="d561e3606294bb7c359a0512789d0bc6239139bee9a05e2754a178f549f63624" exitCode=0 Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.546920 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-smbl2" event={"ID":"a6d77060-47d4-40ad-b1c1-5000a1513aaa","Type":"ContainerDied","Data":"d561e3606294bb7c359a0512789d0bc6239139bee9a05e2754a178f549f63624"} Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.550926 4717 generic.go:334] "Generic (PLEG): container finished" podID="2a07c895-406b-426a-b60a-545e0dced812" 
containerID="008c39892611d17ac1115e163ddbdd097cf110a7d4cc4e20295f38f134bc360f" exitCode=0 Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.551031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9f40-account-create-update-bhbfw" event={"ID":"2a07c895-406b-426a-b60a-545e0dced812","Type":"ContainerDied","Data":"008c39892611d17ac1115e163ddbdd097cf110a7d4cc4e20295f38f134bc360f"} Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.554433 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"689557594a3d31a29e7cfd14c6b1c1f1714ba725e53630c93f9b9734233264fc"} Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.554470 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"7dd3557b9ad02cbedef7a946ecc230b1b1ce450eb1a3de96344c594c663d7004"} Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.554484 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"89eeed73c170c82607a388595c64f32678fae93239363215cfd50438225766e0"} Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.557468 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d445e3b5-cc85-45e1-bcf7-64090947ac5b","Type":"ContainerStarted","Data":"344bfe7e408c4beb924682b8868feaac80462ff38b202cbffdf9c41a89abb55c"} Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.567627 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mcfsn-config-jkbl6" event={"ID":"1f49e0f8-0661-49bb-a6c0-30365a1398f6","Type":"ContainerStarted","Data":"9c3a930dcf3788ab3fffa14759e926c172c7fce0734dcf7a0de30f8e282b88d1"} Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 
05:47:01.567662 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mcfsn-config-jkbl6" event={"ID":"1f49e0f8-0661-49bb-a6c0-30365a1398f6","Type":"ContainerStarted","Data":"b7bdf57aa56c1adb95bdad2a44bef7d54e61f04c09da3df0a9e0ff3bee05ab2a"} Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.600574 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mcfsn-config-jkbl6" podStartSLOduration=2.600553155 podStartE2EDuration="2.600553155s" podCreationTimestamp="2026-03-08 05:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:01.591468651 +0000 UTC m=+1248.509117495" watchObservedRunningTime="2026-03-08 05:47:01.600553155 +0000 UTC m=+1248.518201999" Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.797558 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-4zg55"] Mar 08 05:47:01 crc kubenswrapper[4717]: I0308 05:47:01.802725 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-4zg55"] Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.113026 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-t4wh5" Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.196344 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx5wn\" (UniqueName: \"kubernetes.io/projected/c576697c-a1be-47aa-bf8f-902435b2af04-kube-api-access-cx5wn\") pod \"c576697c-a1be-47aa-bf8f-902435b2af04\" (UID: \"c576697c-a1be-47aa-bf8f-902435b2af04\") " Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.196507 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c576697c-a1be-47aa-bf8f-902435b2af04-operator-scripts\") pod \"c576697c-a1be-47aa-bf8f-902435b2af04\" (UID: \"c576697c-a1be-47aa-bf8f-902435b2af04\") " Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.197047 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c576697c-a1be-47aa-bf8f-902435b2af04-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c576697c-a1be-47aa-bf8f-902435b2af04" (UID: "c576697c-a1be-47aa-bf8f-902435b2af04"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.197316 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c576697c-a1be-47aa-bf8f-902435b2af04-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.200468 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c576697c-a1be-47aa-bf8f-902435b2af04-kube-api-access-cx5wn" (OuterVolumeSpecName: "kube-api-access-cx5wn") pod "c576697c-a1be-47aa-bf8f-902435b2af04" (UID: "c576697c-a1be-47aa-bf8f-902435b2af04"). InnerVolumeSpecName "kube-api-access-cx5wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.298935 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx5wn\" (UniqueName: \"kubernetes.io/projected/c576697c-a1be-47aa-bf8f-902435b2af04-kube-api-access-cx5wn\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.579384 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"b46a2c5477ca6d16831c41287814b95f22b13a3d766a311793c243db1e69862d"} Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.579721 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"7cadb0de19bc964d21b1f96e8f6ce9af7ab3e824a6d822f3aa634ef886390076"} Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.581160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-t4wh5" event={"ID":"c576697c-a1be-47aa-bf8f-902435b2af04","Type":"ContainerDied","Data":"3e97cdd25b6adc36276683aa7aa476e519e3bd86e618cffe153b06b88f7d0767"} Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.581222 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e97cdd25b6adc36276683aa7aa476e519e3bd86e618cffe153b06b88f7d0767" Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.581304 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-t4wh5" Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.583842 4717 generic.go:334] "Generic (PLEG): container finished" podID="1f49e0f8-0661-49bb-a6c0-30365a1398f6" containerID="9c3a930dcf3788ab3fffa14759e926c172c7fce0734dcf7a0de30f8e282b88d1" exitCode=0 Mar 08 05:47:02 crc kubenswrapper[4717]: I0308 05:47:02.583918 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mcfsn-config-jkbl6" event={"ID":"1f49e0f8-0661-49bb-a6c0-30365a1398f6","Type":"ContainerDied","Data":"9c3a930dcf3788ab3fffa14759e926c172c7fce0734dcf7a0de30f8e282b88d1"} Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.168127 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9f40-account-create-update-bhbfw" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.178490 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-smbl2" Mar 08 05:47:03 crc kubenswrapper[4717]: E0308 05:47:03.289290 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a94056_9d2f_45ef_afa3_cf858787fc87.slice/crio-conmon-77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a94056_9d2f_45ef_afa3_cf858787fc87.slice/crio-77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e.scope\": RecentStats: unable to find data in memory cache]" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.321651 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4fq9\" (UniqueName: \"kubernetes.io/projected/2a07c895-406b-426a-b60a-545e0dced812-kube-api-access-k4fq9\") pod 
\"2a07c895-406b-426a-b60a-545e0dced812\" (UID: \"2a07c895-406b-426a-b60a-545e0dced812\") " Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.321831 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c89br\" (UniqueName: \"kubernetes.io/projected/a6d77060-47d4-40ad-b1c1-5000a1513aaa-kube-api-access-c89br\") pod \"a6d77060-47d4-40ad-b1c1-5000a1513aaa\" (UID: \"a6d77060-47d4-40ad-b1c1-5000a1513aaa\") " Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.321944 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a07c895-406b-426a-b60a-545e0dced812-operator-scripts\") pod \"2a07c895-406b-426a-b60a-545e0dced812\" (UID: \"2a07c895-406b-426a-b60a-545e0dced812\") " Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.322001 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d77060-47d4-40ad-b1c1-5000a1513aaa-operator-scripts\") pod \"a6d77060-47d4-40ad-b1c1-5000a1513aaa\" (UID: \"a6d77060-47d4-40ad-b1c1-5000a1513aaa\") " Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.323122 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d77060-47d4-40ad-b1c1-5000a1513aaa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6d77060-47d4-40ad-b1c1-5000a1513aaa" (UID: "a6d77060-47d4-40ad-b1c1-5000a1513aaa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.325071 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a07c895-406b-426a-b60a-545e0dced812-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a07c895-406b-426a-b60a-545e0dced812" (UID: "2a07c895-406b-426a-b60a-545e0dced812"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.328796 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d77060-47d4-40ad-b1c1-5000a1513aaa-kube-api-access-c89br" (OuterVolumeSpecName: "kube-api-access-c89br") pod "a6d77060-47d4-40ad-b1c1-5000a1513aaa" (UID: "a6d77060-47d4-40ad-b1c1-5000a1513aaa"). InnerVolumeSpecName "kube-api-access-c89br". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.334879 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a07c895-406b-426a-b60a-545e0dced812-kube-api-access-k4fq9" (OuterVolumeSpecName: "kube-api-access-k4fq9") pod "2a07c895-406b-426a-b60a-545e0dced812" (UID: "2a07c895-406b-426a-b60a-545e0dced812"). InnerVolumeSpecName "kube-api-access-k4fq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.423642 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6d77060-47d4-40ad-b1c1-5000a1513aaa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.423962 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4fq9\" (UniqueName: \"kubernetes.io/projected/2a07c895-406b-426a-b60a-545e0dced812-kube-api-access-k4fq9\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.423975 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c89br\" (UniqueName: \"kubernetes.io/projected/a6d77060-47d4-40ad-b1c1-5000a1513aaa-kube-api-access-c89br\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.423986 4717 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a07c895-406b-426a-b60a-545e0dced812-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.592997 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec" containerID="7f449e2f2e084acd4fbdbe8981b6c42aa64820889a21b9690b7df2b6ba9a3e74" exitCode=0 Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.593090 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec","Type":"ContainerDied","Data":"7f449e2f2e084acd4fbdbe8981b6c42aa64820889a21b9690b7df2b6ba9a3e74"} Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.600031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9f40-account-create-update-bhbfw" event={"ID":"2a07c895-406b-426a-b60a-545e0dced812","Type":"ContainerDied","Data":"e178d30e806a4fb5411409aeca5ada7bfa294be75c32713e13b28f46453fd743"} Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.600069 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e178d30e806a4fb5411409aeca5ada7bfa294be75c32713e13b28f46453fd743" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.600157 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9f40-account-create-update-bhbfw" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.622277 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"3f523b405639fe550480edd23d886589ae879e1b31df69fbf4fbd8f525de3c9d"} Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.622331 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"98e90b60853afec7bb53917664c8fb8d3522cb8ad1e1feff8226b9b0d4aa1842"} Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.624415 4717 generic.go:334] "Generic (PLEG): container finished" podID="d4a94056-9d2f-45ef-afa3-cf858787fc87" containerID="77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e" exitCode=0 Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.624475 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d4a94056-9d2f-45ef-afa3-cf858787fc87","Type":"ContainerDied","Data":"77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e"} Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.631969 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d445e3b5-cc85-45e1-bcf7-64090947ac5b","Type":"ContainerStarted","Data":"5362172b23f2bff6327e97fd21c49c93ec075402e3bd2ee3ab189e145635183d"} Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.632004 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d445e3b5-cc85-45e1-bcf7-64090947ac5b","Type":"ContainerStarted","Data":"42353326c814dc7623b5f7f00583d93616c6e372d17ca1899c8a5c1704ca9032"} Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.634227 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-smbl2" event={"ID":"a6d77060-47d4-40ad-b1c1-5000a1513aaa","Type":"ContainerDied","Data":"d9a9e5e46bddd2ed19e39925cf80ef73b04525bd3bc1ad7bf6b18ce5b97c1a30"} Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.634259 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-smbl2" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.634266 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9a9e5e46bddd2ed19e39925cf80ef73b04525bd3bc1ad7bf6b18ce5b97c1a30" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.692736 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.692717057 podStartE2EDuration="14.692717057s" podCreationTimestamp="2026-03-08 05:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:03.681736657 +0000 UTC m=+1250.599385501" watchObservedRunningTime="2026-03-08 05:47:03.692717057 +0000 UTC m=+1250.610365901" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.807997 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1355028-71db-4c23-8219-4401a80e2902" path="/var/lib/kubelet/pods/e1355028-71db-4c23-8219-4401a80e2902/volumes" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.881922 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-r5tmd"] Mar 08 05:47:03 crc kubenswrapper[4717]: E0308 05:47:03.882264 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d77060-47d4-40ad-b1c1-5000a1513aaa" containerName="mariadb-database-create" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.882284 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d77060-47d4-40ad-b1c1-5000a1513aaa" containerName="mariadb-database-create" Mar 08 05:47:03 
crc kubenswrapper[4717]: E0308 05:47:03.882298 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c576697c-a1be-47aa-bf8f-902435b2af04" containerName="mariadb-database-create" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.882305 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c576697c-a1be-47aa-bf8f-902435b2af04" containerName="mariadb-database-create" Mar 08 05:47:03 crc kubenswrapper[4717]: E0308 05:47:03.882340 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a07c895-406b-426a-b60a-545e0dced812" containerName="mariadb-account-create-update" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.882347 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a07c895-406b-426a-b60a-545e0dced812" containerName="mariadb-account-create-update" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.882507 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c576697c-a1be-47aa-bf8f-902435b2af04" containerName="mariadb-database-create" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.882526 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a07c895-406b-426a-b60a-545e0dced812" containerName="mariadb-account-create-update" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.882536 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d77060-47d4-40ad-b1c1-5000a1513aaa" containerName="mariadb-database-create" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.886600 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.888761 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vbk5p" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.889031 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.893631 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-r5tmd"] Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.933991 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-db-sync-config-data\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.934068 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-config-data\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.934146 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-combined-ca-bundle\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:03 crc kubenswrapper[4717]: I0308 05:47:03.934180 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpqz2\" (UniqueName: 
\"kubernetes.io/projected/536f54f2-24ab-4b5d-a494-77d2464d03f9-kube-api-access-kpqz2\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.034785 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.035318 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-config-data\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.035431 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-combined-ca-bundle\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.035470 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpqz2\" (UniqueName: \"kubernetes.io/projected/536f54f2-24ab-4b5d-a494-77d2464d03f9-kube-api-access-kpqz2\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.035514 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-db-sync-config-data\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.040345 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-config-data\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.043833 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-combined-ca-bundle\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.051425 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpqz2\" (UniqueName: \"kubernetes.io/projected/536f54f2-24ab-4b5d-a494-77d2464d03f9-kube-api-access-kpqz2\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.052042 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-db-sync-config-data\") pod \"glance-db-sync-r5tmd\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.136480 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run\") pod \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.136585 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run-ovn\") pod \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.136633 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-additional-scripts\") pod \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.136666 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcw75\" (UniqueName: \"kubernetes.io/projected/1f49e0f8-0661-49bb-a6c0-30365a1398f6-kube-api-access-qcw75\") pod \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.136701 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-log-ovn\") pod \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.136736 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-scripts\") pod \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\" (UID: \"1f49e0f8-0661-49bb-a6c0-30365a1398f6\") " Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.138348 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-scripts" (OuterVolumeSpecName: "scripts") pod "1f49e0f8-0661-49bb-a6c0-30365a1398f6" (UID: "1f49e0f8-0661-49bb-a6c0-30365a1398f6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.138440 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run" (OuterVolumeSpecName: "var-run") pod "1f49e0f8-0661-49bb-a6c0-30365a1398f6" (UID: "1f49e0f8-0661-49bb-a6c0-30365a1398f6"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.138511 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1f49e0f8-0661-49bb-a6c0-30365a1398f6" (UID: "1f49e0f8-0661-49bb-a6c0-30365a1398f6"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.139247 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1f49e0f8-0661-49bb-a6c0-30365a1398f6" (UID: "1f49e0f8-0661-49bb-a6c0-30365a1398f6"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.140751 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1f49e0f8-0661-49bb-a6c0-30365a1398f6" (UID: "1f49e0f8-0661-49bb-a6c0-30365a1398f6"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.146871 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f49e0f8-0661-49bb-a6c0-30365a1398f6-kube-api-access-qcw75" (OuterVolumeSpecName: "kube-api-access-qcw75") pod "1f49e0f8-0661-49bb-a6c0-30365a1398f6" (UID: "1f49e0f8-0661-49bb-a6c0-30365a1398f6"). InnerVolumeSpecName "kube-api-access-qcw75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.205499 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.239004 4717 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.239041 4717 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.239054 4717 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.239064 4717 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f49e0f8-0661-49bb-a6c0-30365a1398f6-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.239074 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcw75\" (UniqueName: \"kubernetes.io/projected/1f49e0f8-0661-49bb-a6c0-30365a1398f6-kube-api-access-qcw75\") 
on node \"crc\" DevicePath \"\"" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.239083 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f49e0f8-0661-49bb-a6c0-30365a1398f6-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.645479 4717 generic.go:334] "Generic (PLEG): container finished" podID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" containerID="ae723f586b191c6db521638d50b64a0c103a18f55e7dfb6f19046bc98fd39696" exitCode=0 Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.646116 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ce570a4-b883-4b07-a4a2-e5e820ab538c","Type":"ContainerDied","Data":"ae723f586b191c6db521638d50b64a0c103a18f55e7dfb6f19046bc98fd39696"} Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.663267 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d4a94056-9d2f-45ef-afa3-cf858787fc87","Type":"ContainerStarted","Data":"5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5"} Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.663500 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.702461 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mcfsn-config-jkbl6" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.703958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mcfsn-config-jkbl6" event={"ID":"1f49e0f8-0661-49bb-a6c0-30365a1398f6","Type":"ContainerDied","Data":"b7bdf57aa56c1adb95bdad2a44bef7d54e61f04c09da3df0a9e0ff3bee05ab2a"} Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.704033 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7bdf57aa56c1adb95bdad2a44bef7d54e61f04c09da3df0a9e0ff3bee05ab2a" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.730897 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.584598515 podStartE2EDuration="1m31.730877759s" podCreationTimestamp="2026-03-08 05:45:33 +0000 UTC" firstStartedPulling="2026-03-08 05:45:35.881971068 +0000 UTC m=+1162.799619912" lastFinishedPulling="2026-03-08 05:46:29.028250312 +0000 UTC m=+1215.945899156" observedRunningTime="2026-03-08 05:47:04.726417049 +0000 UTC m=+1251.644065903" watchObservedRunningTime="2026-03-08 05:47:04.730877759 +0000 UTC m=+1251.648526603" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.736072 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec","Type":"ContainerStarted","Data":"122a956dad371024cbea1b1033d82d7b27c7d5d4622a0497d3c4a51f0dc6d29a"} Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.736469 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.802017 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/notifications-rabbitmq-server-0" podStartSLOduration=37.820176137 podStartE2EDuration="1m30.80199868s" 
podCreationTimestamp="2026-03-08 05:45:34 +0000 UTC" firstStartedPulling="2026-03-08 05:45:36.048239084 +0000 UTC m=+1162.965887928" lastFinishedPulling="2026-03-08 05:46:29.030061617 +0000 UTC m=+1215.947710471" observedRunningTime="2026-03-08 05:47:04.793725927 +0000 UTC m=+1251.711374771" watchObservedRunningTime="2026-03-08 05:47:04.80199868 +0000 UTC m=+1251.719647524" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.957801 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.957891 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.970142 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 08 05:47:04 crc kubenswrapper[4717]: I0308 05:47:04.974311 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 08 05:47:05 crc kubenswrapper[4717]: W0308 05:47:05.102302 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod536f54f2_24ab_4b5d_a494_77d2464d03f9.slice/crio-719035cb4497073d1d4c690fe39ce781fab5ae66eefed529ea5bc413a4546b3e WatchSource:0}: Error finding container 719035cb4497073d1d4c690fe39ce781fab5ae66eefed529ea5bc413a4546b3e: Status 404 returned error can't find the container with id 719035cb4497073d1d4c690fe39ce781fab5ae66eefed529ea5bc413a4546b3e Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.106484 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-r5tmd"] Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.137835 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mcfsn-config-jkbl6"] Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 
05:47:05.147722 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mcfsn-config-jkbl6"] Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.754403 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ce570a4-b883-4b07-a4a2-e5e820ab538c","Type":"ContainerStarted","Data":"c4fe89ad42bcf643e8f2825838c44b2ea73d780d43a4d90b6ca83118e44bcafe"} Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.756154 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.765961 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-r5tmd" event={"ID":"536f54f2-24ab-4b5d-a494-77d2464d03f9","Type":"ContainerStarted","Data":"719035cb4497073d1d4c690fe39ce781fab5ae66eefed529ea5bc413a4546b3e"} Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.779530 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"90666fdee0da8691aed926f483156b829cf37346f69661ee7050e3f51cd17dae"} Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.779562 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"cf85133d14d8e27815db9c92847c5977b94d3de69534c861d7650378e11366e0"} Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.779574 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"d203f2fd252a00aa2eaa8a95d589e6a536837e1a44a1d32ce92f57448365047e"} Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.779583 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"09feeafbb1b3b58f221ce276a6d4fda08a48e558eaceee06c08b46ca2167fa47"} Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.779591 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"5245dca367d0560b241a1d7ebb2a3099bbba53f1176ff895137082a859cbc4d7"} Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.779599 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"2c8ce988394335a2810e92aa6d2cc47026b46e13e03f33526beac80e1aaaa00a"} Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.795106 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371944.059685 podStartE2EDuration="1m32.795091182s" podCreationTimestamp="2026-03-08 05:45:33 +0000 UTC" firstStartedPulling="2026-03-08 05:45:35.808183315 +0000 UTC m=+1162.725832159" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:05.794141009 +0000 UTC m=+1252.711789853" watchObservedRunningTime="2026-03-08 05:47:05.795091182 +0000 UTC m=+1252.712740026" Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.795228 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f49e0f8-0661-49bb-a6c0-30365a1398f6" path="/var/lib/kubelet/pods/1f49e0f8-0661-49bb-a6c0-30365a1398f6/volumes" Mar 08 05:47:05 crc kubenswrapper[4717]: I0308 05:47:05.796287 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.793634 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-t5kvr"] Mar 08 05:47:06 crc kubenswrapper[4717]: E0308 
05:47:06.794270 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f49e0f8-0661-49bb-a6c0-30365a1398f6" containerName="ovn-config" Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.794282 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f49e0f8-0661-49bb-a6c0-30365a1398f6" containerName="ovn-config" Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.794429 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f49e0f8-0661-49bb-a6c0-30365a1398f6" containerName="ovn-config" Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.794962 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-t5kvr" Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.801279 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.804767 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"67a11de8-b5e8-40d8-a451-1bece45918d8","Type":"ContainerStarted","Data":"ef93d26ce0080d217a81d0affeff61fa244cb91d37f36a8cba3240c1590ce927"} Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.804804 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-t5kvr"] Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.890166 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.418289008 podStartE2EDuration="26.890147595s" podCreationTimestamp="2026-03-08 05:46:40 +0000 UTC" firstStartedPulling="2026-03-08 05:46:59.036306569 +0000 UTC m=+1245.953955413" lastFinishedPulling="2026-03-08 05:47:04.508165156 +0000 UTC m=+1251.425814000" observedRunningTime="2026-03-08 05:47:06.888015382 +0000 UTC m=+1253.805664226" watchObservedRunningTime="2026-03-08 05:47:06.890147595 +0000 UTC m=+1253.807796439" Mar 
08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.897203 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb3277c4-fd24-4629-8325-6abed267f270-operator-scripts\") pod \"root-account-create-update-t5kvr\" (UID: \"bb3277c4-fd24-4629-8325-6abed267f270\") " pod="openstack/root-account-create-update-t5kvr" Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.897338 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q462t\" (UniqueName: \"kubernetes.io/projected/bb3277c4-fd24-4629-8325-6abed267f270-kube-api-access-q462t\") pod \"root-account-create-update-t5kvr\" (UID: \"bb3277c4-fd24-4629-8325-6abed267f270\") " pod="openstack/root-account-create-update-t5kvr" Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.998968 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb3277c4-fd24-4629-8325-6abed267f270-operator-scripts\") pod \"root-account-create-update-t5kvr\" (UID: \"bb3277c4-fd24-4629-8325-6abed267f270\") " pod="openstack/root-account-create-update-t5kvr" Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.999109 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q462t\" (UniqueName: \"kubernetes.io/projected/bb3277c4-fd24-4629-8325-6abed267f270-kube-api-access-q462t\") pod \"root-account-create-update-t5kvr\" (UID: \"bb3277c4-fd24-4629-8325-6abed267f270\") " pod="openstack/root-account-create-update-t5kvr" Mar 08 05:47:06 crc kubenswrapper[4717]: I0308 05:47:06.999654 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb3277c4-fd24-4629-8325-6abed267f270-operator-scripts\") pod \"root-account-create-update-t5kvr\" (UID: \"bb3277c4-fd24-4629-8325-6abed267f270\") 
" pod="openstack/root-account-create-update-t5kvr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.033194 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q462t\" (UniqueName: \"kubernetes.io/projected/bb3277c4-fd24-4629-8325-6abed267f270-kube-api-access-q462t\") pod \"root-account-create-update-t5kvr\" (UID: \"bb3277c4-fd24-4629-8325-6abed267f270\") " pod="openstack/root-account-create-update-t5kvr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.116052 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-t5kvr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.262667 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf4dd7b85-w8lmr"] Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.264837 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.268032 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.273750 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf4dd7b85-w8lmr"] Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.302802 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-svc\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.302876 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.302914 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-swift-storage-0\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.302969 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zvw\" (UniqueName: \"kubernetes.io/projected/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-kube-api-access-g8zvw\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.303026 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-config\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.303051 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.404947 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.405008 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-swift-storage-0\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.405056 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zvw\" (UniqueName: \"kubernetes.io/projected/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-kube-api-access-g8zvw\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.405097 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-config\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.405132 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.405167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-svc\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.406404 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-svc\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.406942 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-swift-storage-0\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.406958 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-config\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.407202 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.407620 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" 
(UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.431670 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8zvw\" (UniqueName: \"kubernetes.io/projected/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-kube-api-access-g8zvw\") pod \"dnsmasq-dns-5bf4dd7b85-w8lmr\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.583701 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.628172 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-t5kvr"] Mar 08 05:47:07 crc kubenswrapper[4717]: I0308 05:47:07.819863 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-t5kvr" event={"ID":"bb3277c4-fd24-4629-8325-6abed267f270","Type":"ContainerStarted","Data":"1fa7d461d8188042e8767fe093e242ac66af950ef695a7dd6251d518edb0dc11"} Mar 08 05:47:08 crc kubenswrapper[4717]: I0308 05:47:08.099653 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf4dd7b85-w8lmr"] Mar 08 05:47:08 crc kubenswrapper[4717]: I0308 05:47:08.830479 4717 generic.go:334] "Generic (PLEG): container finished" podID="bb3277c4-fd24-4629-8325-6abed267f270" containerID="c1c2fd3dc0c770759a543293e06d12b62d1bf3abd53b33e80ff7f1d9a482704b" exitCode=0 Mar 08 05:47:08 crc kubenswrapper[4717]: I0308 05:47:08.830534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-t5kvr" event={"ID":"bb3277c4-fd24-4629-8325-6abed267f270","Type":"ContainerDied","Data":"c1c2fd3dc0c770759a543293e06d12b62d1bf3abd53b33e80ff7f1d9a482704b"} Mar 08 05:47:08 crc kubenswrapper[4717]: I0308 05:47:08.836571 4717 generic.go:334] "Generic (PLEG): 
container finished" podID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerID="868d3357f9dd8d9db5a03de86cdd4c286b5649a56a4ff0d86e23bc3082e4bec7" exitCode=0 Mar 08 05:47:08 crc kubenswrapper[4717]: I0308 05:47:08.836625 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" event={"ID":"9b2a46ff-0442-4a0e-b8da-7c24b20df3da","Type":"ContainerDied","Data":"868d3357f9dd8d9db5a03de86cdd4c286b5649a56a4ff0d86e23bc3082e4bec7"} Mar 08 05:47:08 crc kubenswrapper[4717]: I0308 05:47:08.836666 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" event={"ID":"9b2a46ff-0442-4a0e-b8da-7c24b20df3da","Type":"ContainerStarted","Data":"65178bab378d5202ac4cda5b1fd2d15cd835d51debc976efc60c7782189590aa"} Mar 08 05:47:09 crc kubenswrapper[4717]: I0308 05:47:09.015941 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mcfsn" Mar 08 05:47:09 crc kubenswrapper[4717]: I0308 05:47:09.847736 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" event={"ID":"9b2a46ff-0442-4a0e-b8da-7c24b20df3da","Type":"ContainerStarted","Data":"2a74ed5fd2bc4bf0d6a0c5137e41e1810d7da7df39de8f7139ead1105a730543"} Mar 08 05:47:09 crc kubenswrapper[4717]: I0308 05:47:09.885883 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" podStartSLOduration=2.885865655 podStartE2EDuration="2.885865655s" podCreationTimestamp="2026-03-08 05:47:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:09.880053582 +0000 UTC m=+1256.797702436" watchObservedRunningTime="2026-03-08 05:47:09.885865655 +0000 UTC m=+1256.803514499" Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.291434 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-t5kvr" Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.453897 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q462t\" (UniqueName: \"kubernetes.io/projected/bb3277c4-fd24-4629-8325-6abed267f270-kube-api-access-q462t\") pod \"bb3277c4-fd24-4629-8325-6abed267f270\" (UID: \"bb3277c4-fd24-4629-8325-6abed267f270\") " Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.454394 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb3277c4-fd24-4629-8325-6abed267f270-operator-scripts\") pod \"bb3277c4-fd24-4629-8325-6abed267f270\" (UID: \"bb3277c4-fd24-4629-8325-6abed267f270\") " Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.455104 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb3277c4-fd24-4629-8325-6abed267f270-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb3277c4-fd24-4629-8325-6abed267f270" (UID: "bb3277c4-fd24-4629-8325-6abed267f270"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.459589 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3277c4-fd24-4629-8325-6abed267f270-kube-api-access-q462t" (OuterVolumeSpecName: "kube-api-access-q462t") pod "bb3277c4-fd24-4629-8325-6abed267f270" (UID: "bb3277c4-fd24-4629-8325-6abed267f270"). InnerVolumeSpecName "kube-api-access-q462t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.555974 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb3277c4-fd24-4629-8325-6abed267f270-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.556001 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q462t\" (UniqueName: \"kubernetes.io/projected/bb3277c4-fd24-4629-8325-6abed267f270-kube-api-access-q462t\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.857410 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-t5kvr" Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.857424 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-t5kvr" event={"ID":"bb3277c4-fd24-4629-8325-6abed267f270","Type":"ContainerDied","Data":"1fa7d461d8188042e8767fe093e242ac66af950ef695a7dd6251d518edb0dc11"} Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.857512 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fa7d461d8188042e8767fe093e242ac66af950ef695a7dd6251d518edb0dc11" Mar 08 05:47:10 crc kubenswrapper[4717]: I0308 05:47:10.857578 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:15 crc kubenswrapper[4717]: I0308 05:47:15.146566 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Mar 08 05:47:15 crc kubenswrapper[4717]: I0308 05:47:15.230935 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" 
podUID="d4a94056-9d2f-45ef-afa3-cf858787fc87" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Mar 08 05:47:15 crc kubenswrapper[4717]: I0308 05:47:15.449703 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/notifications-rabbitmq-server-0" podUID="f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.111:5671: connect: connection refused" Mar 08 05:47:17 crc kubenswrapper[4717]: I0308 05:47:17.584911 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:47:17 crc kubenswrapper[4717]: I0308 05:47:17.664033 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bf9dcd95-lbcd4"] Mar 08 05:47:17 crc kubenswrapper[4717]: I0308 05:47:17.664331 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" podUID="3406523d-3819-494f-9270-c6ad58910d30" containerName="dnsmasq-dns" containerID="cri-o://937e5048d54aa262d500d2a6e10bc62ace20ddcf53b37b8c957b4f4af650b273" gracePeriod=10 Mar 08 05:47:17 crc kubenswrapper[4717]: I0308 05:47:17.915763 4717 generic.go:334] "Generic (PLEG): container finished" podID="3406523d-3819-494f-9270-c6ad58910d30" containerID="937e5048d54aa262d500d2a6e10bc62ace20ddcf53b37b8c957b4f4af650b273" exitCode=0 Mar 08 05:47:17 crc kubenswrapper[4717]: I0308 05:47:17.915802 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" event={"ID":"3406523d-3819-494f-9270-c6ad58910d30","Type":"ContainerDied","Data":"937e5048d54aa262d500d2a6e10bc62ace20ddcf53b37b8c957b4f4af650b273"} Mar 08 05:47:21 crc kubenswrapper[4717]: I0308 05:47:21.190794 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" podUID="3406523d-3819-494f-9270-c6ad58910d30" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.603403 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.704232 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-config\") pod \"3406523d-3819-494f-9270-c6ad58910d30\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.704297 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-nb\") pod \"3406523d-3819-494f-9270-c6ad58910d30\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.704353 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6lhw\" (UniqueName: \"kubernetes.io/projected/3406523d-3819-494f-9270-c6ad58910d30-kube-api-access-p6lhw\") pod \"3406523d-3819-494f-9270-c6ad58910d30\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.704447 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-sb\") pod \"3406523d-3819-494f-9270-c6ad58910d30\" (UID: \"3406523d-3819-494f-9270-c6ad58910d30\") " Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.704521 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-dns-svc\") pod \"3406523d-3819-494f-9270-c6ad58910d30\" (UID: 
\"3406523d-3819-494f-9270-c6ad58910d30\") " Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.716612 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3406523d-3819-494f-9270-c6ad58910d30-kube-api-access-p6lhw" (OuterVolumeSpecName: "kube-api-access-p6lhw") pod "3406523d-3819-494f-9270-c6ad58910d30" (UID: "3406523d-3819-494f-9270-c6ad58910d30"). InnerVolumeSpecName "kube-api-access-p6lhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.759461 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3406523d-3819-494f-9270-c6ad58910d30" (UID: "3406523d-3819-494f-9270-c6ad58910d30"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.761255 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-config" (OuterVolumeSpecName: "config") pod "3406523d-3819-494f-9270-c6ad58910d30" (UID: "3406523d-3819-494f-9270-c6ad58910d30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.762981 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3406523d-3819-494f-9270-c6ad58910d30" (UID: "3406523d-3819-494f-9270-c6ad58910d30"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.777477 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3406523d-3819-494f-9270-c6ad58910d30" (UID: "3406523d-3819-494f-9270-c6ad58910d30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.806554 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.806585 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6lhw\" (UniqueName: \"kubernetes.io/projected/3406523d-3819-494f-9270-c6ad58910d30-kube-api-access-p6lhw\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.806595 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.806604 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.806613 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3406523d-3819-494f-9270-c6ad58910d30-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.964599 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" 
event={"ID":"3406523d-3819-494f-9270-c6ad58910d30","Type":"ContainerDied","Data":"3e80510e75069e00c491f33e3f91b189c95cbdb60f3054e0800ace965f0d7419"} Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.964640 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bf9dcd95-lbcd4" Mar 08 05:47:23 crc kubenswrapper[4717]: I0308 05:47:23.964663 4717 scope.go:117] "RemoveContainer" containerID="937e5048d54aa262d500d2a6e10bc62ace20ddcf53b37b8c957b4f4af650b273" Mar 08 05:47:24 crc kubenswrapper[4717]: I0308 05:47:24.003431 4717 scope.go:117] "RemoveContainer" containerID="cdb0801cf44d2814be0de99ef023dba75db4f26bbfd0eddb9895bd7ef5fd56f1" Mar 08 05:47:24 crc kubenswrapper[4717]: I0308 05:47:24.005766 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bf9dcd95-lbcd4"] Mar 08 05:47:24 crc kubenswrapper[4717]: I0308 05:47:24.012257 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bf9dcd95-lbcd4"] Mar 08 05:47:24 crc kubenswrapper[4717]: I0308 05:47:24.545774 4717 scope.go:117] "RemoveContainer" containerID="bb87281bca193672a9b4398d82ec10abf47e1e32da4d7bfd427efc62a5f7fd3e" Mar 08 05:47:24 crc kubenswrapper[4717]: I0308 05:47:24.988988 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-r5tmd" event={"ID":"536f54f2-24ab-4b5d-a494-77d2464d03f9","Type":"ContainerStarted","Data":"3943fe4796494594cdb3420ef1d779da4c4fb9883a27fe3efae364b447849b8f"} Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.024709 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-r5tmd" podStartSLOduration=3.715977036 podStartE2EDuration="22.02465873s" podCreationTimestamp="2026-03-08 05:47:03 +0000 UTC" firstStartedPulling="2026-03-08 05:47:05.11107016 +0000 UTC m=+1252.028719004" lastFinishedPulling="2026-03-08 05:47:23.419751844 +0000 UTC m=+1270.337400698" observedRunningTime="2026-03-08 
05:47:25.012589904 +0000 UTC m=+1271.930238788" watchObservedRunningTime="2026-03-08 05:47:25.02465873 +0000 UTC m=+1271.942307604" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.146866 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.233156 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.446442 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/notifications-rabbitmq-server-0" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.701270 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xvl9h"] Mar 08 05:47:25 crc kubenswrapper[4717]: E0308 05:47:25.701595 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3406523d-3819-494f-9270-c6ad58910d30" containerName="init" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.701610 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3406523d-3819-494f-9270-c6ad58910d30" containerName="init" Mar 08 05:47:25 crc kubenswrapper[4717]: E0308 05:47:25.701629 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3406523d-3819-494f-9270-c6ad58910d30" containerName="dnsmasq-dns" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.701635 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3406523d-3819-494f-9270-c6ad58910d30" containerName="dnsmasq-dns" Mar 08 05:47:25 crc kubenswrapper[4717]: E0308 05:47:25.701645 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3277c4-fd24-4629-8325-6abed267f270" containerName="mariadb-account-create-update" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.701652 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3277c4-fd24-4629-8325-6abed267f270" 
containerName="mariadb-account-create-update" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.701824 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3277c4-fd24-4629-8325-6abed267f270" containerName="mariadb-account-create-update" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.701842 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3406523d-3819-494f-9270-c6ad58910d30" containerName="dnsmasq-dns" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.702387 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xvl9h" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.727657 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xvl9h"] Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.746367 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7fw9\" (UniqueName: \"kubernetes.io/projected/8602ca4e-9467-4ad0-a725-c129f29bbedf-kube-api-access-d7fw9\") pod \"cinder-db-create-xvl9h\" (UID: \"8602ca4e-9467-4ad0-a725-c129f29bbedf\") " pod="openstack/cinder-db-create-xvl9h" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.746469 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8602ca4e-9467-4ad0-a725-c129f29bbedf-operator-scripts\") pod \"cinder-db-create-xvl9h\" (UID: \"8602ca4e-9467-4ad0-a725-c129f29bbedf\") " pod="openstack/cinder-db-create-xvl9h" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.792998 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3406523d-3819-494f-9270-c6ad58910d30" path="/var/lib/kubelet/pods/3406523d-3819-494f-9270-c6ad58910d30/volumes" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.828324 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-217a-account-create-update-pplnn"] Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.829558 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-217a-account-create-update-pplnn" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.835177 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.847758 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7fw9\" (UniqueName: \"kubernetes.io/projected/8602ca4e-9467-4ad0-a725-c129f29bbedf-kube-api-access-d7fw9\") pod \"cinder-db-create-xvl9h\" (UID: \"8602ca4e-9467-4ad0-a725-c129f29bbedf\") " pod="openstack/cinder-db-create-xvl9h" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.847853 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8602ca4e-9467-4ad0-a725-c129f29bbedf-operator-scripts\") pod \"cinder-db-create-xvl9h\" (UID: \"8602ca4e-9467-4ad0-a725-c129f29bbedf\") " pod="openstack/cinder-db-create-xvl9h" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.849404 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8602ca4e-9467-4ad0-a725-c129f29bbedf-operator-scripts\") pod \"cinder-db-create-xvl9h\" (UID: \"8602ca4e-9467-4ad0-a725-c129f29bbedf\") " pod="openstack/cinder-db-create-xvl9h" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.866303 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-217a-account-create-update-pplnn"] Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.897973 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7fw9\" (UniqueName: \"kubernetes.io/projected/8602ca4e-9467-4ad0-a725-c129f29bbedf-kube-api-access-d7fw9\") 
pod \"cinder-db-create-xvl9h\" (UID: \"8602ca4e-9467-4ad0-a725-c129f29bbedf\") " pod="openstack/cinder-db-create-xvl9h" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.949872 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt5vx\" (UniqueName: \"kubernetes.io/projected/081ac052-0d8e-40f7-8618-7786fece62b3-kube-api-access-tt5vx\") pod \"cinder-217a-account-create-update-pplnn\" (UID: \"081ac052-0d8e-40f7-8618-7786fece62b3\") " pod="openstack/cinder-217a-account-create-update-pplnn" Mar 08 05:47:25 crc kubenswrapper[4717]: I0308 05:47:25.949938 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/081ac052-0d8e-40f7-8618-7786fece62b3-operator-scripts\") pod \"cinder-217a-account-create-update-pplnn\" (UID: \"081ac052-0d8e-40f7-8618-7786fece62b3\") " pod="openstack/cinder-217a-account-create-update-pplnn" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.001257 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wvdfv"] Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.003040 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wvdfv" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.015125 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-1e73-account-create-update-dmhcb"] Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.016155 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1e73-account-create-update-dmhcb" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.023817 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.039573 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xvl9h" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.051954 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twsj7\" (UniqueName: \"kubernetes.io/projected/c01a3977-851c-4679-b543-a7f550c8ec53-kube-api-access-twsj7\") pod \"barbican-db-create-wvdfv\" (UID: \"c01a3977-851c-4679-b543-a7f550c8ec53\") " pod="openstack/barbican-db-create-wvdfv" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.052005 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd06b81e-1f5a-4f94-9b81-b24889dc8154-operator-scripts\") pod \"barbican-1e73-account-create-update-dmhcb\" (UID: \"bd06b81e-1f5a-4f94-9b81-b24889dc8154\") " pod="openstack/barbican-1e73-account-create-update-dmhcb" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.052078 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt5vx\" (UniqueName: \"kubernetes.io/projected/081ac052-0d8e-40f7-8618-7786fece62b3-kube-api-access-tt5vx\") pod \"cinder-217a-account-create-update-pplnn\" (UID: \"081ac052-0d8e-40f7-8618-7786fece62b3\") " pod="openstack/cinder-217a-account-create-update-pplnn" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.052111 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv2qb\" (UniqueName: \"kubernetes.io/projected/bd06b81e-1f5a-4f94-9b81-b24889dc8154-kube-api-access-qv2qb\") pod \"barbican-1e73-account-create-update-dmhcb\" (UID: \"bd06b81e-1f5a-4f94-9b81-b24889dc8154\") " pod="openstack/barbican-1e73-account-create-update-dmhcb" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.052135 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/081ac052-0d8e-40f7-8618-7786fece62b3-operator-scripts\") pod \"cinder-217a-account-create-update-pplnn\" (UID: \"081ac052-0d8e-40f7-8618-7786fece62b3\") " pod="openstack/cinder-217a-account-create-update-pplnn" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.052153 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c01a3977-851c-4679-b543-a7f550c8ec53-operator-scripts\") pod \"barbican-db-create-wvdfv\" (UID: \"c01a3977-851c-4679-b543-a7f550c8ec53\") " pod="openstack/barbican-db-create-wvdfv" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.054081 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/081ac052-0d8e-40f7-8618-7786fece62b3-operator-scripts\") pod \"cinder-217a-account-create-update-pplnn\" (UID: \"081ac052-0d8e-40f7-8618-7786fece62b3\") " pod="openstack/cinder-217a-account-create-update-pplnn" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.075396 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt5vx\" (UniqueName: \"kubernetes.io/projected/081ac052-0d8e-40f7-8618-7786fece62b3-kube-api-access-tt5vx\") pod \"cinder-217a-account-create-update-pplnn\" (UID: \"081ac052-0d8e-40f7-8618-7786fece62b3\") " pod="openstack/cinder-217a-account-create-update-pplnn" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.077570 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-h5xmg"] Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.079027 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.090648 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.095014 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.095449 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z9bhb" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.095996 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.120778 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-h5xmg"] Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.149071 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-217a-account-create-update-pplnn" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.153743 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv2qb\" (UniqueName: \"kubernetes.io/projected/bd06b81e-1f5a-4f94-9b81-b24889dc8154-kube-api-access-qv2qb\") pod \"barbican-1e73-account-create-update-dmhcb\" (UID: \"bd06b81e-1f5a-4f94-9b81-b24889dc8154\") " pod="openstack/barbican-1e73-account-create-update-dmhcb" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.153855 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c01a3977-851c-4679-b543-a7f550c8ec53-operator-scripts\") pod \"barbican-db-create-wvdfv\" (UID: \"c01a3977-851c-4679-b543-a7f550c8ec53\") " pod="openstack/barbican-db-create-wvdfv" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.153881 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns9fc\" (UniqueName: \"kubernetes.io/projected/9da7adfb-c194-4c69-af19-99e2ff00dbfa-kube-api-access-ns9fc\") pod \"keystone-db-sync-h5xmg\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.153988 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-combined-ca-bundle\") pod \"keystone-db-sync-h5xmg\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.154182 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twsj7\" (UniqueName: \"kubernetes.io/projected/c01a3977-851c-4679-b543-a7f550c8ec53-kube-api-access-twsj7\") pod \"barbican-db-create-wvdfv\" (UID: \"c01a3977-851c-4679-b543-a7f550c8ec53\") " pod="openstack/barbican-db-create-wvdfv" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.154221 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd06b81e-1f5a-4f94-9b81-b24889dc8154-operator-scripts\") pod \"barbican-1e73-account-create-update-dmhcb\" (UID: \"bd06b81e-1f5a-4f94-9b81-b24889dc8154\") " pod="openstack/barbican-1e73-account-create-update-dmhcb" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.154255 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-config-data\") pod \"keystone-db-sync-h5xmg\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.154521 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c01a3977-851c-4679-b543-a7f550c8ec53-operator-scripts\") pod \"barbican-db-create-wvdfv\" (UID: \"c01a3977-851c-4679-b543-a7f550c8ec53\") " pod="openstack/barbican-db-create-wvdfv" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.155146 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd06b81e-1f5a-4f94-9b81-b24889dc8154-operator-scripts\") pod \"barbican-1e73-account-create-update-dmhcb\" (UID: \"bd06b81e-1f5a-4f94-9b81-b24889dc8154\") " pod="openstack/barbican-1e73-account-create-update-dmhcb" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.166954 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wvdfv"] Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.184763 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1e73-account-create-update-dmhcb"] Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.192260 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv2qb\" (UniqueName: \"kubernetes.io/projected/bd06b81e-1f5a-4f94-9b81-b24889dc8154-kube-api-access-qv2qb\") pod \"barbican-1e73-account-create-update-dmhcb\" (UID: \"bd06b81e-1f5a-4f94-9b81-b24889dc8154\") " pod="openstack/barbican-1e73-account-create-update-dmhcb" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.215383 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twsj7\" (UniqueName: \"kubernetes.io/projected/c01a3977-851c-4679-b543-a7f550c8ec53-kube-api-access-twsj7\") pod \"barbican-db-create-wvdfv\" (UID: \"c01a3977-851c-4679-b543-a7f550c8ec53\") " pod="openstack/barbican-db-create-wvdfv" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.256798 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-config-data\") pod \"keystone-db-sync-h5xmg\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.256892 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns9fc\" (UniqueName: \"kubernetes.io/projected/9da7adfb-c194-4c69-af19-99e2ff00dbfa-kube-api-access-ns9fc\") pod \"keystone-db-sync-h5xmg\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.256954 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-combined-ca-bundle\") pod \"keystone-db-sync-h5xmg\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.260930 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-config-data\") pod \"keystone-db-sync-h5xmg\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.261203 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-combined-ca-bundle\") pod \"keystone-db-sync-h5xmg\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.274533 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns9fc\" (UniqueName: 
\"kubernetes.io/projected/9da7adfb-c194-4c69-af19-99e2ff00dbfa-kube-api-access-ns9fc\") pod \"keystone-db-sync-h5xmg\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.316770 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wvdfv" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.341399 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1e73-account-create-update-dmhcb" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.535341 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.676805 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-217a-account-create-update-pplnn"] Mar 08 05:47:26 crc kubenswrapper[4717]: W0308 05:47:26.680464 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod081ac052_0d8e_40f7_8618_7786fece62b3.slice/crio-9b02f3f799888050baab22d6403895357d195eb18b6b2a523394ccf75881414c WatchSource:0}: Error finding container 9b02f3f799888050baab22d6403895357d195eb18b6b2a523394ccf75881414c: Status 404 returned error can't find the container with id 9b02f3f799888050baab22d6403895357d195eb18b6b2a523394ccf75881414c Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.738863 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xvl9h"] Mar 08 05:47:26 crc kubenswrapper[4717]: W0308 05:47:26.767863 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8602ca4e_9467_4ad0_a725_c129f29bbedf.slice/crio-6af8e4b14f1aa80d344da2443694b5ff5deef35d84e8006c0b87ebd73917d8f4 WatchSource:0}: Error finding container 
6af8e4b14f1aa80d344da2443694b5ff5deef35d84e8006c0b87ebd73917d8f4: Status 404 returned error can't find the container with id 6af8e4b14f1aa80d344da2443694b5ff5deef35d84e8006c0b87ebd73917d8f4 Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.848514 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1e73-account-create-update-dmhcb"] Mar 08 05:47:26 crc kubenswrapper[4717]: I0308 05:47:26.956628 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wvdfv"] Mar 08 05:47:27 crc kubenswrapper[4717]: I0308 05:47:27.004861 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wvdfv" event={"ID":"c01a3977-851c-4679-b543-a7f550c8ec53","Type":"ContainerStarted","Data":"f7821ebf002473aaa40df6fcf8a3d21f33c2b1918994560bb3bed257a911e522"} Mar 08 05:47:27 crc kubenswrapper[4717]: I0308 05:47:27.005922 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1e73-account-create-update-dmhcb" event={"ID":"bd06b81e-1f5a-4f94-9b81-b24889dc8154","Type":"ContainerStarted","Data":"f976994bf19041bfd204727f87952306f2bc36402af02032c2db2509ad219857"} Mar 08 05:47:27 crc kubenswrapper[4717]: I0308 05:47:27.008428 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-217a-account-create-update-pplnn" event={"ID":"081ac052-0d8e-40f7-8618-7786fece62b3","Type":"ContainerStarted","Data":"88116863db703bb3b571378b6c7e55e0f340f27f38cfb2c370b14f9a8a90c895"} Mar 08 05:47:27 crc kubenswrapper[4717]: I0308 05:47:27.008535 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-217a-account-create-update-pplnn" event={"ID":"081ac052-0d8e-40f7-8618-7786fece62b3","Type":"ContainerStarted","Data":"9b02f3f799888050baab22d6403895357d195eb18b6b2a523394ccf75881414c"} Mar 08 05:47:27 crc kubenswrapper[4717]: I0308 05:47:27.014312 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xvl9h" 
event={"ID":"8602ca4e-9467-4ad0-a725-c129f29bbedf","Type":"ContainerStarted","Data":"083eec1c0f86dee19866dc6eaf00858722d74cab29a43e6d88f14412ea08ca92"} Mar 08 05:47:27 crc kubenswrapper[4717]: I0308 05:47:27.014349 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xvl9h" event={"ID":"8602ca4e-9467-4ad0-a725-c129f29bbedf","Type":"ContainerStarted","Data":"6af8e4b14f1aa80d344da2443694b5ff5deef35d84e8006c0b87ebd73917d8f4"} Mar 08 05:47:27 crc kubenswrapper[4717]: I0308 05:47:27.030790 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-217a-account-create-update-pplnn" podStartSLOduration=2.030774315 podStartE2EDuration="2.030774315s" podCreationTimestamp="2026-03-08 05:47:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:27.028394876 +0000 UTC m=+1273.946043720" watchObservedRunningTime="2026-03-08 05:47:27.030774315 +0000 UTC m=+1273.948423159" Mar 08 05:47:27 crc kubenswrapper[4717]: I0308 05:47:27.061282 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-xvl9h" podStartSLOduration=2.061259856 podStartE2EDuration="2.061259856s" podCreationTimestamp="2026-03-08 05:47:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:27.048711397 +0000 UTC m=+1273.966360241" watchObservedRunningTime="2026-03-08 05:47:27.061259856 +0000 UTC m=+1273.978908700" Mar 08 05:47:27 crc kubenswrapper[4717]: I0308 05:47:27.081758 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-h5xmg"] Mar 08 05:47:27 crc kubenswrapper[4717]: W0308 05:47:27.106791 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9da7adfb_c194_4c69_af19_99e2ff00dbfa.slice/crio-5a24d531a489824859015819a97768c4b6e6b61f65506e7dc4921e0d2189262f WatchSource:0}: Error finding container 5a24d531a489824859015819a97768c4b6e6b61f65506e7dc4921e0d2189262f: Status 404 returned error can't find the container with id 5a24d531a489824859015819a97768c4b6e6b61f65506e7dc4921e0d2189262f Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.023399 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h5xmg" event={"ID":"9da7adfb-c194-4c69-af19-99e2ff00dbfa","Type":"ContainerStarted","Data":"5a24d531a489824859015819a97768c4b6e6b61f65506e7dc4921e0d2189262f"} Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.025327 4717 generic.go:334] "Generic (PLEG): container finished" podID="c01a3977-851c-4679-b543-a7f550c8ec53" containerID="b63971cf580c1772314687faac888cade14410bc46fd71b3900fadc9caf9252f" exitCode=0 Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.025405 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wvdfv" event={"ID":"c01a3977-851c-4679-b543-a7f550c8ec53","Type":"ContainerDied","Data":"b63971cf580c1772314687faac888cade14410bc46fd71b3900fadc9caf9252f"} Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.026736 4717 generic.go:334] "Generic (PLEG): container finished" podID="bd06b81e-1f5a-4f94-9b81-b24889dc8154" containerID="38df43df38f0283dc78351ea63e00fded75b4291888b55bd10fe1cf656f105dc" exitCode=0 Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.026798 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1e73-account-create-update-dmhcb" event={"ID":"bd06b81e-1f5a-4f94-9b81-b24889dc8154","Type":"ContainerDied","Data":"38df43df38f0283dc78351ea63e00fded75b4291888b55bd10fe1cf656f105dc"} Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.029047 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="081ac052-0d8e-40f7-8618-7786fece62b3" containerID="88116863db703bb3b571378b6c7e55e0f340f27f38cfb2c370b14f9a8a90c895" exitCode=0 Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.029090 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-217a-account-create-update-pplnn" event={"ID":"081ac052-0d8e-40f7-8618-7786fece62b3","Type":"ContainerDied","Data":"88116863db703bb3b571378b6c7e55e0f340f27f38cfb2c370b14f9a8a90c895"} Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.031098 4717 generic.go:334] "Generic (PLEG): container finished" podID="8602ca4e-9467-4ad0-a725-c129f29bbedf" containerID="083eec1c0f86dee19866dc6eaf00858722d74cab29a43e6d88f14412ea08ca92" exitCode=0 Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.031128 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xvl9h" event={"ID":"8602ca4e-9467-4ad0-a725-c129f29bbedf","Type":"ContainerDied","Data":"083eec1c0f86dee19866dc6eaf00858722d74cab29a43e6d88f14412ea08ca92"} Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.521551 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-m97sm"] Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.523170 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.525549 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-4bgvm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.526203 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.532067 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-m97sm"] Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.580159 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-sr87b"] Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.581343 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sr87b" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.591347 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sr87b"] Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.620825 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-combined-ca-bundle\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.620934 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-db-sync-config-data\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.621068 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-config-data\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.621121 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klqd2\" (UniqueName: \"kubernetes.io/projected/90593a29-3f8c-4228-8c82-a183a4e33054-kube-api-access-klqd2\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.674787 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a0cf-account-create-update-skd47"] Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.675910 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a0cf-account-create-update-skd47" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.679361 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.683436 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a0cf-account-create-update-skd47"] Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.722968 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-config-data\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.723043 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klqd2\" (UniqueName: \"kubernetes.io/projected/90593a29-3f8c-4228-8c82-a183a4e33054-kube-api-access-klqd2\") pod 
\"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.723110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-combined-ca-bundle\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.723148 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-db-sync-config-data\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.723203 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc72f330-6146-4017-adde-5a63a6cffdb4-operator-scripts\") pod \"neutron-a0cf-account-create-update-skd47\" (UID: \"cc72f330-6146-4017-adde-5a63a6cffdb4\") " pod="openstack/neutron-a0cf-account-create-update-skd47" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.723231 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9qm2\" (UniqueName: \"kubernetes.io/projected/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-kube-api-access-d9qm2\") pod \"neutron-db-create-sr87b\" (UID: \"44abc9b7-3edc-42a0-bcf0-8b07efcec67f\") " pod="openstack/neutron-db-create-sr87b" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.723254 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-operator-scripts\") pod \"neutron-db-create-sr87b\" (UID: \"44abc9b7-3edc-42a0-bcf0-8b07efcec67f\") " pod="openstack/neutron-db-create-sr87b" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.723290 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvlbt\" (UniqueName: \"kubernetes.io/projected/cc72f330-6146-4017-adde-5a63a6cffdb4-kube-api-access-tvlbt\") pod \"neutron-a0cf-account-create-update-skd47\" (UID: \"cc72f330-6146-4017-adde-5a63a6cffdb4\") " pod="openstack/neutron-a0cf-account-create-update-skd47" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.728390 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-config-data\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.728647 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-db-sync-config-data\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.745225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-combined-ca-bundle\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.746428 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klqd2\" (UniqueName: 
\"kubernetes.io/projected/90593a29-3f8c-4228-8c82-a183a4e33054-kube-api-access-klqd2\") pod \"watcher-db-sync-m97sm\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.825691 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlbt\" (UniqueName: \"kubernetes.io/projected/cc72f330-6146-4017-adde-5a63a6cffdb4-kube-api-access-tvlbt\") pod \"neutron-a0cf-account-create-update-skd47\" (UID: \"cc72f330-6146-4017-adde-5a63a6cffdb4\") " pod="openstack/neutron-a0cf-account-create-update-skd47" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.825884 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc72f330-6146-4017-adde-5a63a6cffdb4-operator-scripts\") pod \"neutron-a0cf-account-create-update-skd47\" (UID: \"cc72f330-6146-4017-adde-5a63a6cffdb4\") " pod="openstack/neutron-a0cf-account-create-update-skd47" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.826763 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc72f330-6146-4017-adde-5a63a6cffdb4-operator-scripts\") pod \"neutron-a0cf-account-create-update-skd47\" (UID: \"cc72f330-6146-4017-adde-5a63a6cffdb4\") " pod="openstack/neutron-a0cf-account-create-update-skd47" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.826853 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9qm2\" (UniqueName: \"kubernetes.io/projected/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-kube-api-access-d9qm2\") pod \"neutron-db-create-sr87b\" (UID: \"44abc9b7-3edc-42a0-bcf0-8b07efcec67f\") " pod="openstack/neutron-db-create-sr87b" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.827175 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-operator-scripts\") pod \"neutron-db-create-sr87b\" (UID: \"44abc9b7-3edc-42a0-bcf0-8b07efcec67f\") " pod="openstack/neutron-db-create-sr87b" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.827723 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-operator-scripts\") pod \"neutron-db-create-sr87b\" (UID: \"44abc9b7-3edc-42a0-bcf0-8b07efcec67f\") " pod="openstack/neutron-db-create-sr87b" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.841975 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvlbt\" (UniqueName: \"kubernetes.io/projected/cc72f330-6146-4017-adde-5a63a6cffdb4-kube-api-access-tvlbt\") pod \"neutron-a0cf-account-create-update-skd47\" (UID: \"cc72f330-6146-4017-adde-5a63a6cffdb4\") " pod="openstack/neutron-a0cf-account-create-update-skd47" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.843099 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9qm2\" (UniqueName: \"kubernetes.io/projected/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-kube-api-access-d9qm2\") pod \"neutron-db-create-sr87b\" (UID: \"44abc9b7-3edc-42a0-bcf0-8b07efcec67f\") " pod="openstack/neutron-db-create-sr87b" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.847570 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.898990 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sr87b" Mar 08 05:47:28 crc kubenswrapper[4717]: I0308 05:47:28.995382 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a0cf-account-create-update-skd47" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.330079 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-m97sm"] Mar 08 05:47:29 crc kubenswrapper[4717]: W0308 05:47:29.394579 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90593a29_3f8c_4228_8c82_a183a4e33054.slice/crio-aed1934d902787f3474bcb89c851d02f6d998b85092e05a66e04798235a857f4 WatchSource:0}: Error finding container aed1934d902787f3474bcb89c851d02f6d998b85092e05a66e04798235a857f4: Status 404 returned error can't find the container with id aed1934d902787f3474bcb89c851d02f6d998b85092e05a66e04798235a857f4 Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.508713 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wvdfv" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.514219 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-217a-account-create-update-pplnn" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.521115 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xvl9h" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.537753 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1e73-account-create-update-dmhcb" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.643743 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sr87b"] Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.644182 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv2qb\" (UniqueName: \"kubernetes.io/projected/bd06b81e-1f5a-4f94-9b81-b24889dc8154-kube-api-access-qv2qb\") pod \"bd06b81e-1f5a-4f94-9b81-b24889dc8154\" (UID: \"bd06b81e-1f5a-4f94-9b81-b24889dc8154\") " Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.644257 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7fw9\" (UniqueName: \"kubernetes.io/projected/8602ca4e-9467-4ad0-a725-c129f29bbedf-kube-api-access-d7fw9\") pod \"8602ca4e-9467-4ad0-a725-c129f29bbedf\" (UID: \"8602ca4e-9467-4ad0-a725-c129f29bbedf\") " Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.644281 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd06b81e-1f5a-4f94-9b81-b24889dc8154-operator-scripts\") pod \"bd06b81e-1f5a-4f94-9b81-b24889dc8154\" (UID: \"bd06b81e-1f5a-4f94-9b81-b24889dc8154\") " Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.644308 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/081ac052-0d8e-40f7-8618-7786fece62b3-operator-scripts\") pod \"081ac052-0d8e-40f7-8618-7786fece62b3\" (UID: \"081ac052-0d8e-40f7-8618-7786fece62b3\") " Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.644354 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt5vx\" (UniqueName: \"kubernetes.io/projected/081ac052-0d8e-40f7-8618-7786fece62b3-kube-api-access-tt5vx\") pod 
\"081ac052-0d8e-40f7-8618-7786fece62b3\" (UID: \"081ac052-0d8e-40f7-8618-7786fece62b3\") " Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.644447 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8602ca4e-9467-4ad0-a725-c129f29bbedf-operator-scripts\") pod \"8602ca4e-9467-4ad0-a725-c129f29bbedf\" (UID: \"8602ca4e-9467-4ad0-a725-c129f29bbedf\") " Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.644487 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twsj7\" (UniqueName: \"kubernetes.io/projected/c01a3977-851c-4679-b543-a7f550c8ec53-kube-api-access-twsj7\") pod \"c01a3977-851c-4679-b543-a7f550c8ec53\" (UID: \"c01a3977-851c-4679-b543-a7f550c8ec53\") " Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.644547 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c01a3977-851c-4679-b543-a7f550c8ec53-operator-scripts\") pod \"c01a3977-851c-4679-b543-a7f550c8ec53\" (UID: \"c01a3977-851c-4679-b543-a7f550c8ec53\") " Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.645283 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c01a3977-851c-4679-b543-a7f550c8ec53-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c01a3977-851c-4679-b543-a7f550c8ec53" (UID: "c01a3977-851c-4679-b543-a7f550c8ec53"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.645301 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8602ca4e-9467-4ad0-a725-c129f29bbedf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8602ca4e-9467-4ad0-a725-c129f29bbedf" (UID: "8602ca4e-9467-4ad0-a725-c129f29bbedf"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.645333 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd06b81e-1f5a-4f94-9b81-b24889dc8154-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd06b81e-1f5a-4f94-9b81-b24889dc8154" (UID: "bd06b81e-1f5a-4f94-9b81-b24889dc8154"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.645516 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081ac052-0d8e-40f7-8618-7786fece62b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "081ac052-0d8e-40f7-8618-7786fece62b3" (UID: "081ac052-0d8e-40f7-8618-7786fece62b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.650175 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c01a3977-851c-4679-b543-a7f550c8ec53-kube-api-access-twsj7" (OuterVolumeSpecName: "kube-api-access-twsj7") pod "c01a3977-851c-4679-b543-a7f550c8ec53" (UID: "c01a3977-851c-4679-b543-a7f550c8ec53"). InnerVolumeSpecName "kube-api-access-twsj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.650977 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd06b81e-1f5a-4f94-9b81-b24889dc8154-kube-api-access-qv2qb" (OuterVolumeSpecName: "kube-api-access-qv2qb") pod "bd06b81e-1f5a-4f94-9b81-b24889dc8154" (UID: "bd06b81e-1f5a-4f94-9b81-b24889dc8154"). InnerVolumeSpecName "kube-api-access-qv2qb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.651219 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8602ca4e-9467-4ad0-a725-c129f29bbedf-kube-api-access-d7fw9" (OuterVolumeSpecName: "kube-api-access-d7fw9") pod "8602ca4e-9467-4ad0-a725-c129f29bbedf" (UID: "8602ca4e-9467-4ad0-a725-c129f29bbedf"). InnerVolumeSpecName "kube-api-access-d7fw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.651637 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081ac052-0d8e-40f7-8618-7786fece62b3-kube-api-access-tt5vx" (OuterVolumeSpecName: "kube-api-access-tt5vx") pod "081ac052-0d8e-40f7-8618-7786fece62b3" (UID: "081ac052-0d8e-40f7-8618-7786fece62b3"). InnerVolumeSpecName "kube-api-access-tt5vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.709202 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a0cf-account-create-update-skd47"] Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.746832 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8602ca4e-9467-4ad0-a725-c129f29bbedf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.746881 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twsj7\" (UniqueName: \"kubernetes.io/projected/c01a3977-851c-4679-b543-a7f550c8ec53-kube-api-access-twsj7\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.746892 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c01a3977-851c-4679-b543-a7f550c8ec53-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:29 
crc kubenswrapper[4717]: I0308 05:47:29.746901 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv2qb\" (UniqueName: \"kubernetes.io/projected/bd06b81e-1f5a-4f94-9b81-b24889dc8154-kube-api-access-qv2qb\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.746911 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7fw9\" (UniqueName: \"kubernetes.io/projected/8602ca4e-9467-4ad0-a725-c129f29bbedf-kube-api-access-d7fw9\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.746941 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd06b81e-1f5a-4f94-9b81-b24889dc8154-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.746952 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/081ac052-0d8e-40f7-8618-7786fece62b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:29 crc kubenswrapper[4717]: I0308 05:47:29.746960 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt5vx\" (UniqueName: \"kubernetes.io/projected/081ac052-0d8e-40f7-8618-7786fece62b3-kube-api-access-tt5vx\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.086643 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a0cf-account-create-update-skd47" event={"ID":"cc72f330-6146-4017-adde-5a63a6cffdb4","Type":"ContainerStarted","Data":"638fb144780a760acef193ddb2040a3dffb74bb1fd6d741e6aa474c94a8f4ac6"} Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.086988 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a0cf-account-create-update-skd47" 
event={"ID":"cc72f330-6146-4017-adde-5a63a6cffdb4","Type":"ContainerStarted","Data":"f8e59c2829e98a2b857ce3dec2b5e784742950d10069d134e40f56a4ea6ffc89"} Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.089062 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wvdfv" event={"ID":"c01a3977-851c-4679-b543-a7f550c8ec53","Type":"ContainerDied","Data":"f7821ebf002473aaa40df6fcf8a3d21f33c2b1918994560bb3bed257a911e522"} Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.089094 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7821ebf002473aaa40df6fcf8a3d21f33c2b1918994560bb3bed257a911e522" Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.089152 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wvdfv" Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.092268 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1e73-account-create-update-dmhcb" Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.092442 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1e73-account-create-update-dmhcb" event={"ID":"bd06b81e-1f5a-4f94-9b81-b24889dc8154","Type":"ContainerDied","Data":"f976994bf19041bfd204727f87952306f2bc36402af02032c2db2509ad219857"} Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.092479 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f976994bf19041bfd204727f87952306f2bc36402af02032c2db2509ad219857" Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.096750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-217a-account-create-update-pplnn" event={"ID":"081ac052-0d8e-40f7-8618-7786fece62b3","Type":"ContainerDied","Data":"9b02f3f799888050baab22d6403895357d195eb18b6b2a523394ccf75881414c"} Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.096772 
4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b02f3f799888050baab22d6403895357d195eb18b6b2a523394ccf75881414c" Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.096772 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-217a-account-create-update-pplnn" Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.099309 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sr87b" event={"ID":"44abc9b7-3edc-42a0-bcf0-8b07efcec67f","Type":"ContainerStarted","Data":"1885c81c3e71bf433e55afd291b620b96f295a75aae4aa9b087adb55005bea90"} Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.099357 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sr87b" event={"ID":"44abc9b7-3edc-42a0-bcf0-8b07efcec67f","Type":"ContainerStarted","Data":"253bca1039615b1fb0b1f1294bad476b5f4ec2e8cc16568bfd5553141eddd797"} Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.105509 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-a0cf-account-create-update-skd47" podStartSLOduration=2.10548791 podStartE2EDuration="2.10548791s" podCreationTimestamp="2026-03-08 05:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:30.104635649 +0000 UTC m=+1277.022284493" watchObservedRunningTime="2026-03-08 05:47:30.10548791 +0000 UTC m=+1277.023136754" Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.106157 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xvl9h" event={"ID":"8602ca4e-9467-4ad0-a725-c129f29bbedf","Type":"ContainerDied","Data":"6af8e4b14f1aa80d344da2443694b5ff5deef35d84e8006c0b87ebd73917d8f4"} Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.106196 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xvl9h" Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.106207 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6af8e4b14f1aa80d344da2443694b5ff5deef35d84e8006c0b87ebd73917d8f4" Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.108358 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-m97sm" event={"ID":"90593a29-3f8c-4228-8c82-a183a4e33054","Type":"ContainerStarted","Data":"aed1934d902787f3474bcb89c851d02f6d998b85092e05a66e04798235a857f4"} Mar 08 05:47:30 crc kubenswrapper[4717]: I0308 05:47:30.126040 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-sr87b" podStartSLOduration=2.126017856 podStartE2EDuration="2.126017856s" podCreationTimestamp="2026-03-08 05:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:30.118602563 +0000 UTC m=+1277.036251407" watchObservedRunningTime="2026-03-08 05:47:30.126017856 +0000 UTC m=+1277.043666700" Mar 08 05:47:31 crc kubenswrapper[4717]: I0308 05:47:31.131708 4717 generic.go:334] "Generic (PLEG): container finished" podID="44abc9b7-3edc-42a0-bcf0-8b07efcec67f" containerID="1885c81c3e71bf433e55afd291b620b96f295a75aae4aa9b087adb55005bea90" exitCode=0 Mar 08 05:47:31 crc kubenswrapper[4717]: I0308 05:47:31.131768 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sr87b" event={"ID":"44abc9b7-3edc-42a0-bcf0-8b07efcec67f","Type":"ContainerDied","Data":"1885c81c3e71bf433e55afd291b620b96f295a75aae4aa9b087adb55005bea90"} Mar 08 05:47:31 crc kubenswrapper[4717]: I0308 05:47:31.133805 4717 generic.go:334] "Generic (PLEG): container finished" podID="cc72f330-6146-4017-adde-5a63a6cffdb4" containerID="638fb144780a760acef193ddb2040a3dffb74bb1fd6d741e6aa474c94a8f4ac6" exitCode=0 Mar 08 05:47:31 crc 
kubenswrapper[4717]: I0308 05:47:31.133860 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a0cf-account-create-update-skd47" event={"ID":"cc72f330-6146-4017-adde-5a63a6cffdb4","Type":"ContainerDied","Data":"638fb144780a760acef193ddb2040a3dffb74bb1fd6d741e6aa474c94a8f4ac6"} Mar 08 05:47:32 crc kubenswrapper[4717]: I0308 05:47:32.144737 4717 generic.go:334] "Generic (PLEG): container finished" podID="536f54f2-24ab-4b5d-a494-77d2464d03f9" containerID="3943fe4796494594cdb3420ef1d779da4c4fb9883a27fe3efae364b447849b8f" exitCode=0 Mar 08 05:47:32 crc kubenswrapper[4717]: I0308 05:47:32.144842 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-r5tmd" event={"ID":"536f54f2-24ab-4b5d-a494-77d2464d03f9","Type":"ContainerDied","Data":"3943fe4796494594cdb3420ef1d779da4c4fb9883a27fe3efae364b447849b8f"} Mar 08 05:47:32 crc kubenswrapper[4717]: I0308 05:47:32.938147 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sr87b" Mar 08 05:47:32 crc kubenswrapper[4717]: I0308 05:47:32.951621 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a0cf-account-create-update-skd47" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.045319 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc72f330-6146-4017-adde-5a63a6cffdb4-operator-scripts\") pod \"cc72f330-6146-4017-adde-5a63a6cffdb4\" (UID: \"cc72f330-6146-4017-adde-5a63a6cffdb4\") " Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.045447 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-operator-scripts\") pod \"44abc9b7-3edc-42a0-bcf0-8b07efcec67f\" (UID: \"44abc9b7-3edc-42a0-bcf0-8b07efcec67f\") " Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.045599 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvlbt\" (UniqueName: \"kubernetes.io/projected/cc72f330-6146-4017-adde-5a63a6cffdb4-kube-api-access-tvlbt\") pod \"cc72f330-6146-4017-adde-5a63a6cffdb4\" (UID: \"cc72f330-6146-4017-adde-5a63a6cffdb4\") " Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.045766 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9qm2\" (UniqueName: \"kubernetes.io/projected/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-kube-api-access-d9qm2\") pod \"44abc9b7-3edc-42a0-bcf0-8b07efcec67f\" (UID: \"44abc9b7-3edc-42a0-bcf0-8b07efcec67f\") " Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.046497 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44abc9b7-3edc-42a0-bcf0-8b07efcec67f" (UID: "44abc9b7-3edc-42a0-bcf0-8b07efcec67f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.046634 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc72f330-6146-4017-adde-5a63a6cffdb4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc72f330-6146-4017-adde-5a63a6cffdb4" (UID: "cc72f330-6146-4017-adde-5a63a6cffdb4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.049971 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-kube-api-access-d9qm2" (OuterVolumeSpecName: "kube-api-access-d9qm2") pod "44abc9b7-3edc-42a0-bcf0-8b07efcec67f" (UID: "44abc9b7-3edc-42a0-bcf0-8b07efcec67f"). InnerVolumeSpecName "kube-api-access-d9qm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.050705 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc72f330-6146-4017-adde-5a63a6cffdb4-kube-api-access-tvlbt" (OuterVolumeSpecName: "kube-api-access-tvlbt") pod "cc72f330-6146-4017-adde-5a63a6cffdb4" (UID: "cc72f330-6146-4017-adde-5a63a6cffdb4"). InnerVolumeSpecName "kube-api-access-tvlbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.148063 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvlbt\" (UniqueName: \"kubernetes.io/projected/cc72f330-6146-4017-adde-5a63a6cffdb4-kube-api-access-tvlbt\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.148090 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9qm2\" (UniqueName: \"kubernetes.io/projected/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-kube-api-access-d9qm2\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.148099 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc72f330-6146-4017-adde-5a63a6cffdb4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.148107 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44abc9b7-3edc-42a0-bcf0-8b07efcec67f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.161479 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sr87b" event={"ID":"44abc9b7-3edc-42a0-bcf0-8b07efcec67f","Type":"ContainerDied","Data":"253bca1039615b1fb0b1f1294bad476b5f4ec2e8cc16568bfd5553141eddd797"} Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.161530 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="253bca1039615b1fb0b1f1294bad476b5f4ec2e8cc16568bfd5553141eddd797" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.161535 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-sr87b" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.163877 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a0cf-account-create-update-skd47" event={"ID":"cc72f330-6146-4017-adde-5a63a6cffdb4","Type":"ContainerDied","Data":"f8e59c2829e98a2b857ce3dec2b5e784742950d10069d134e40f56a4ea6ffc89"} Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.163932 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8e59c2829e98a2b857ce3dec2b5e784742950d10069d134e40f56a4ea6ffc89" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.164007 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a0cf-account-create-update-skd47" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.167273 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h5xmg" event={"ID":"9da7adfb-c194-4c69-af19-99e2ff00dbfa","Type":"ContainerStarted","Data":"35d68b7105c54ff4a19e15560804238b3d22a51fa077f704faf2030cb4a39663"} Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.187672 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-h5xmg" podStartSLOduration=1.577828565 podStartE2EDuration="7.187652519s" podCreationTimestamp="2026-03-08 05:47:26 +0000 UTC" firstStartedPulling="2026-03-08 05:47:27.112358294 +0000 UTC m=+1274.030007138" lastFinishedPulling="2026-03-08 05:47:32.722182248 +0000 UTC m=+1279.639831092" observedRunningTime="2026-03-08 05:47:33.182891041 +0000 UTC m=+1280.100539905" watchObservedRunningTime="2026-03-08 05:47:33.187652519 +0000 UTC m=+1280.105301373" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.641231 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.760998 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-db-sync-config-data\") pod \"536f54f2-24ab-4b5d-a494-77d2464d03f9\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.761162 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-config-data\") pod \"536f54f2-24ab-4b5d-a494-77d2464d03f9\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.762423 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpqz2\" (UniqueName: \"kubernetes.io/projected/536f54f2-24ab-4b5d-a494-77d2464d03f9-kube-api-access-kpqz2\") pod \"536f54f2-24ab-4b5d-a494-77d2464d03f9\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.762544 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-combined-ca-bundle\") pod \"536f54f2-24ab-4b5d-a494-77d2464d03f9\" (UID: \"536f54f2-24ab-4b5d-a494-77d2464d03f9\") " Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.766443 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "536f54f2-24ab-4b5d-a494-77d2464d03f9" (UID: "536f54f2-24ab-4b5d-a494-77d2464d03f9"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.766443 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536f54f2-24ab-4b5d-a494-77d2464d03f9-kube-api-access-kpqz2" (OuterVolumeSpecName: "kube-api-access-kpqz2") pod "536f54f2-24ab-4b5d-a494-77d2464d03f9" (UID: "536f54f2-24ab-4b5d-a494-77d2464d03f9"). InnerVolumeSpecName "kube-api-access-kpqz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.797904 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "536f54f2-24ab-4b5d-a494-77d2464d03f9" (UID: "536f54f2-24ab-4b5d-a494-77d2464d03f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.818263 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-config-data" (OuterVolumeSpecName: "config-data") pod "536f54f2-24ab-4b5d-a494-77d2464d03f9" (UID: "536f54f2-24ab-4b5d-a494-77d2464d03f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.864360 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpqz2\" (UniqueName: \"kubernetes.io/projected/536f54f2-24ab-4b5d-a494-77d2464d03f9-kube-api-access-kpqz2\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.864385 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.864397 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:33 crc kubenswrapper[4717]: I0308 05:47:33.864405 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/536f54f2-24ab-4b5d-a494-77d2464d03f9-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.177860 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-r5tmd" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.189389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-r5tmd" event={"ID":"536f54f2-24ab-4b5d-a494-77d2464d03f9","Type":"ContainerDied","Data":"719035cb4497073d1d4c690fe39ce781fab5ae66eefed529ea5bc413a4546b3e"} Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.189455 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="719035cb4497073d1d4c690fe39ce781fab5ae66eefed529ea5bc413a4546b3e" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.591638 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84ccd66db5-zqj6j"] Mar 08 05:47:34 crc kubenswrapper[4717]: E0308 05:47:34.592119 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8602ca4e-9467-4ad0-a725-c129f29bbedf" containerName="mariadb-database-create" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592134 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8602ca4e-9467-4ad0-a725-c129f29bbedf" containerName="mariadb-database-create" Mar 08 05:47:34 crc kubenswrapper[4717]: E0308 05:47:34.592149 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44abc9b7-3edc-42a0-bcf0-8b07efcec67f" containerName="mariadb-database-create" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592156 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="44abc9b7-3edc-42a0-bcf0-8b07efcec67f" containerName="mariadb-database-create" Mar 08 05:47:34 crc kubenswrapper[4717]: E0308 05:47:34.592173 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="081ac052-0d8e-40f7-8618-7786fece62b3" containerName="mariadb-account-create-update" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592182 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="081ac052-0d8e-40f7-8618-7786fece62b3" containerName="mariadb-account-create-update" Mar 
08 05:47:34 crc kubenswrapper[4717]: E0308 05:47:34.592193 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536f54f2-24ab-4b5d-a494-77d2464d03f9" containerName="glance-db-sync" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592201 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="536f54f2-24ab-4b5d-a494-77d2464d03f9" containerName="glance-db-sync" Mar 08 05:47:34 crc kubenswrapper[4717]: E0308 05:47:34.592215 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd06b81e-1f5a-4f94-9b81-b24889dc8154" containerName="mariadb-account-create-update" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592223 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd06b81e-1f5a-4f94-9b81-b24889dc8154" containerName="mariadb-account-create-update" Mar 08 05:47:34 crc kubenswrapper[4717]: E0308 05:47:34.592234 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc72f330-6146-4017-adde-5a63a6cffdb4" containerName="mariadb-account-create-update" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592242 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc72f330-6146-4017-adde-5a63a6cffdb4" containerName="mariadb-account-create-update" Mar 08 05:47:34 crc kubenswrapper[4717]: E0308 05:47:34.592259 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01a3977-851c-4679-b543-a7f550c8ec53" containerName="mariadb-database-create" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592267 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01a3977-851c-4679-b543-a7f550c8ec53" containerName="mariadb-database-create" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592443 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c01a3977-851c-4679-b543-a7f550c8ec53" containerName="mariadb-database-create" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592497 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cc72f330-6146-4017-adde-5a63a6cffdb4" containerName="mariadb-account-create-update" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592520 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="536f54f2-24ab-4b5d-a494-77d2464d03f9" containerName="glance-db-sync" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592540 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="44abc9b7-3edc-42a0-bcf0-8b07efcec67f" containerName="mariadb-database-create" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592559 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8602ca4e-9467-4ad0-a725-c129f29bbedf" containerName="mariadb-database-create" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592572 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="081ac052-0d8e-40f7-8618-7786fece62b3" containerName="mariadb-account-create-update" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.592593 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd06b81e-1f5a-4f94-9b81-b24889dc8154" containerName="mariadb-account-create-update" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.594030 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.615546 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84ccd66db5-zqj6j"] Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.679636 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-sb\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.679714 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msxff\" (UniqueName: \"kubernetes.io/projected/a13b845f-ecdb-40c3-9ece-11fce0e6421a-kube-api-access-msxff\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.679752 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-swift-storage-0\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.679779 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-nb\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.679804 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-svc\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.679834 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-config\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.781872 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-sb\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.782189 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msxff\" (UniqueName: \"kubernetes.io/projected/a13b845f-ecdb-40c3-9ece-11fce0e6421a-kube-api-access-msxff\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.782210 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-swift-storage-0\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.782261 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-nb\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.782295 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-svc\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.782355 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-config\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.783018 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-sb\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.783145 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-config\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.783379 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-nb\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.783580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-svc\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.783735 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-swift-storage-0\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.801173 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msxff\" (UniqueName: \"kubernetes.io/projected/a13b845f-ecdb-40c3-9ece-11fce0e6421a-kube-api-access-msxff\") pod \"dnsmasq-dns-84ccd66db5-zqj6j\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:34 crc kubenswrapper[4717]: I0308 05:47:34.923601 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:38 crc kubenswrapper[4717]: I0308 05:47:38.218335 4717 generic.go:334] "Generic (PLEG): container finished" podID="9da7adfb-c194-4c69-af19-99e2ff00dbfa" containerID="35d68b7105c54ff4a19e15560804238b3d22a51fa077f704faf2030cb4a39663" exitCode=0 Mar 08 05:47:38 crc kubenswrapper[4717]: I0308 05:47:38.218414 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h5xmg" event={"ID":"9da7adfb-c194-4c69-af19-99e2ff00dbfa","Type":"ContainerDied","Data":"35d68b7105c54ff4a19e15560804238b3d22a51fa077f704faf2030cb4a39663"} Mar 08 05:47:38 crc kubenswrapper[4717]: I0308 05:47:38.812499 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84ccd66db5-zqj6j"] Mar 08 05:47:38 crc kubenswrapper[4717]: W0308 05:47:38.814621 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda13b845f_ecdb_40c3_9ece_11fce0e6421a.slice/crio-99a627e4d199935b57495dda9f3d0a580dd87e1a8cbbc52330dd14b0e373478d WatchSource:0}: Error finding container 99a627e4d199935b57495dda9f3d0a580dd87e1a8cbbc52330dd14b0e373478d: Status 404 returned error can't find the container with id 99a627e4d199935b57495dda9f3d0a580dd87e1a8cbbc52330dd14b0e373478d Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.229597 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-m97sm" event={"ID":"90593a29-3f8c-4228-8c82-a183a4e33054","Type":"ContainerStarted","Data":"2ecc3c5b09cedad03e8d0a14eaa8eaaf53a5377ab3d674d50a3decf8ef90d3b1"} Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.232180 4717 generic.go:334] "Generic (PLEG): container finished" podID="a13b845f-ecdb-40c3-9ece-11fce0e6421a" containerID="3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc" exitCode=0 Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.232296 4717 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" event={"ID":"a13b845f-ecdb-40c3-9ece-11fce0e6421a","Type":"ContainerDied","Data":"3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc"} Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.232361 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" event={"ID":"a13b845f-ecdb-40c3-9ece-11fce0e6421a","Type":"ContainerStarted","Data":"99a627e4d199935b57495dda9f3d0a580dd87e1a8cbbc52330dd14b0e373478d"} Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.270027 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-m97sm" podStartSLOduration=2.076502956 podStartE2EDuration="11.270007097s" podCreationTimestamp="2026-03-08 05:47:28 +0000 UTC" firstStartedPulling="2026-03-08 05:47:29.403136957 +0000 UTC m=+1276.320785801" lastFinishedPulling="2026-03-08 05:47:38.596641088 +0000 UTC m=+1285.514289942" observedRunningTime="2026-03-08 05:47:39.247230877 +0000 UTC m=+1286.164879731" watchObservedRunningTime="2026-03-08 05:47:39.270007097 +0000 UTC m=+1286.187655951" Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.698735 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h5xmg" Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.778541 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-combined-ca-bundle\") pod \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.778672 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-config-data\") pod \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.778769 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns9fc\" (UniqueName: \"kubernetes.io/projected/9da7adfb-c194-4c69-af19-99e2ff00dbfa-kube-api-access-ns9fc\") pod \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\" (UID: \"9da7adfb-c194-4c69-af19-99e2ff00dbfa\") " Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.794129 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da7adfb-c194-4c69-af19-99e2ff00dbfa-kube-api-access-ns9fc" (OuterVolumeSpecName: "kube-api-access-ns9fc") pod "9da7adfb-c194-4c69-af19-99e2ff00dbfa" (UID: "9da7adfb-c194-4c69-af19-99e2ff00dbfa"). InnerVolumeSpecName "kube-api-access-ns9fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.810197 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9da7adfb-c194-4c69-af19-99e2ff00dbfa" (UID: "9da7adfb-c194-4c69-af19-99e2ff00dbfa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.837211 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-config-data" (OuterVolumeSpecName: "config-data") pod "9da7adfb-c194-4c69-af19-99e2ff00dbfa" (UID: "9da7adfb-c194-4c69-af19-99e2ff00dbfa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.881524 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.881562 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da7adfb-c194-4c69-af19-99e2ff00dbfa-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:39 crc kubenswrapper[4717]: I0308 05:47:39.881571 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns9fc\" (UniqueName: \"kubernetes.io/projected/9da7adfb-c194-4c69-af19-99e2ff00dbfa-kube-api-access-ns9fc\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.244319 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" event={"ID":"a13b845f-ecdb-40c3-9ece-11fce0e6421a","Type":"ContainerStarted","Data":"48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb"} Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.244653 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.248932 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h5xmg"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.253937 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h5xmg" event={"ID":"9da7adfb-c194-4c69-af19-99e2ff00dbfa","Type":"ContainerDied","Data":"5a24d531a489824859015819a97768c4b6e6b61f65506e7dc4921e0d2189262f"}
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.253996 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a24d531a489824859015819a97768c4b6e6b61f65506e7dc4921e0d2189262f"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.293687 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" podStartSLOduration=6.293662341 podStartE2EDuration="6.293662341s" podCreationTimestamp="2026-03-08 05:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:40.291816546 +0000 UTC m=+1287.209465490" watchObservedRunningTime="2026-03-08 05:47:40.293662341 +0000 UTC m=+1287.211311195"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.533151 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84ccd66db5-zqj6j"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.543943 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-drf74"]
Mar 08 05:47:40 crc kubenswrapper[4717]: E0308 05:47:40.544435 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da7adfb-c194-4c69-af19-99e2ff00dbfa" containerName="keystone-db-sync"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.544452 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da7adfb-c194-4c69-af19-99e2ff00dbfa" containerName="keystone-db-sync"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.544619 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da7adfb-c194-4c69-af19-99e2ff00dbfa" containerName="keystone-db-sync"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.545273 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.553138 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z9bhb"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.553403 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.553596 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.553784 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.554032 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.598802 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-drf74"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.618176 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c965dbf49-dfpr6"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.620220 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.658783 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c965dbf49-dfpr6"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.700908 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjgcw\" (UniqueName: \"kubernetes.io/projected/d5e16c30-bd40-46c7-ab56-185a8551aebe-kube-api-access-mjgcw\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701000 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701037 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-fernet-keys\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701064 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701098 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-config-data\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701128 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-scripts\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701162 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-credential-keys\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701196 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701227 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sth27\" (UniqueName: \"kubernetes.io/projected/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-kube-api-access-sth27\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701249 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-config\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701277 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-svc\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.701336 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-combined-ca-bundle\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.718189 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-htmd2"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.732359 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.749226 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.749423 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xh2tz"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.749532 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.755076 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-685d956dff-kcqnq"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.756421 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.762056 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-f89fw"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.762266 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.762394 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.762479 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.785370 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-htmd2"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.802471 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-685d956dff-kcqnq"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803337 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-svc\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803386 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-combined-ca-bundle\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-combined-ca-bundle\") pod \"neutron-db-sync-htmd2\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803451 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjgcw\" (UniqueName: \"kubernetes.io/projected/d5e16c30-bd40-46c7-ab56-185a8551aebe-kube-api-access-mjgcw\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803466 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-config\") pod \"neutron-db-sync-htmd2\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803500 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w24m\" (UniqueName: \"kubernetes.io/projected/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-kube-api-access-8w24m\") pod \"neutron-db-sync-htmd2\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803521 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803539 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-fernet-keys\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803559 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803588 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-config-data\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803615 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-scripts\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803638 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-credential-keys\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803656 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803680 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sth27\" (UniqueName: \"kubernetes.io/projected/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-kube-api-access-sth27\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.803700 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-config\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.804564 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-config\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.805138 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-svc\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.814074 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-combined-ca-bundle\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.814938 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.818332 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.819563 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.826056 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-config-data\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.826273 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-credential-keys\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.841181 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sth27\" (UniqueName: \"kubernetes.io/projected/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-kube-api-access-sth27\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.855115 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjgcw\" (UniqueName: \"kubernetes.io/projected/d5e16c30-bd40-46c7-ab56-185a8551aebe-kube-api-access-mjgcw\") pod \"dnsmasq-dns-5c965dbf49-dfpr6\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.869203 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-scripts\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.874777 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-fernet-keys\") pod \"keystone-bootstrap-drf74\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.881985 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-drf74"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.906929 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c965dbf49-dfpr6"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.907534 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.908415 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-scripts\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.908444 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-logs\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.908560 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-combined-ca-bundle\") pod \"neutron-db-sync-htmd2\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.908596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-config\") pod \"neutron-db-sync-htmd2\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.908651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgk8g\" (UniqueName: \"kubernetes.io/projected/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-kube-api-access-hgk8g\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.908675 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w24m\" (UniqueName: \"kubernetes.io/projected/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-kube-api-access-8w24m\") pod \"neutron-db-sync-htmd2\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.908702 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-config-data\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.908736 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-horizon-secret-key\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.936447 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-config\") pod \"neutron-db-sync-htmd2\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.944528 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-combined-ca-bundle\") pod \"neutron-db-sync-htmd2\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.957795 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.959837 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-wvxkm"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.960889 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.964140 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wvxkm"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.971098 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.971447 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.971617 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-762pl"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.971755 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.972369 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.976410 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w24m\" (UniqueName: \"kubernetes.io/projected/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-kube-api-access-8w24m\") pod \"neutron-db-sync-htmd2\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.980771 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-66q5h"]
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.981787 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-66q5h"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.987063 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.987135 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 08 05:47:40 crc kubenswrapper[4717]: I0308 05:47:40.987247 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qhn9b"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.006200 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rlvrs"]
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.008179 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rlvrs"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011321 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-combined-ca-bundle\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011457 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgk8g\" (UniqueName: \"kubernetes.io/projected/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-kube-api-access-hgk8g\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011476 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbszg\" (UniqueName: \"kubernetes.io/projected/76fba18a-af8c-449a-be74-e2ad6438afa0-kube-api-access-jbszg\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011490 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fclx\" (UniqueName: \"kubernetes.io/projected/ffc84338-48ac-4538-b134-5993d5a9f91c-kube-api-access-2fclx\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-config-data\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011529 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-horizon-secret-key\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011546 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-log-httpd\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011565 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffc84338-48ac-4538-b134-5993d5a9f91c-logs\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011580 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-scripts\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011602 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-scripts\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011629 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-config-data\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-scripts\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011857 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-logs\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011893 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-config-data\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011910 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.011935 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-run-httpd\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.012298 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-logs\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.013380 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-config-data\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.013475 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-scripts\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.020349 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-horizon-secret-key\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.024818 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-swvz4"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.025368 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.035241 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wvxkm"]
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.047228 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgk8g\" (UniqueName: \"kubernetes.io/projected/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-kube-api-access-hgk8g\") pod \"horizon-685d956dff-kcqnq\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.055368 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.068522 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-htmd2"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.085070 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-66q5h"]
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.085597 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-685d956dff-kcqnq"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.109884 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5555fccc9f-gxdtl"]
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.115920 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.122181 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rlvrs"]
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.124420 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-run-httpd\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.124515 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.124558 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-combined-ca-bundle\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.124601 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-combined-ca-bundle\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.124902 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-run-httpd\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.124679 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzn7\" (UniqueName: \"kubernetes.io/projected/ec6c6686-44c7-49ec-950b-7054d96e207d-kube-api-access-fwzn7\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125111 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbszg\" (UniqueName: \"kubernetes.io/projected/76fba18a-af8c-449a-be74-e2ad6438afa0-kube-api-access-jbszg\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125139 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fclx\" (UniqueName: \"kubernetes.io/projected/ffc84338-48ac-4538-b134-5993d5a9f91c-kube-api-access-2fclx\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm"
Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125250 4717
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-db-sync-config-data\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125392 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-log-httpd\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlqkp\" (UniqueName: \"kubernetes.io/projected/6c734bf7-1916-4a47-93e0-42caaaced812-kube-api-access-nlqkp\") pod \"barbican-db-sync-rlvrs\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125446 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffc84338-48ac-4538-b134-5993d5a9f91c-logs\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125473 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-scripts\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125541 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-combined-ca-bundle\") pod \"barbican-db-sync-rlvrs\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-scripts\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125662 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6c6686-44c7-49ec-950b-7054d96e207d-etc-machine-id\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125720 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-config-data\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125787 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-config-data\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125923 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-scripts\") pod \"cinder-db-sync-66q5h\" (UID: 
\"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.125972 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-db-sync-config-data\") pod \"barbican-db-sync-rlvrs\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.126030 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-config-data\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.126068 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.140703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-log-httpd\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.140969 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffc84338-48ac-4538-b134-5993d5a9f91c-logs\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.142844 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-config-data\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.145088 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-combined-ca-bundle\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.146158 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-scripts\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.146467 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-config-data\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.149568 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-scripts\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.154028 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " 
pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.154050 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5555fccc9f-gxdtl"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.154390 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.160003 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fclx\" (UniqueName: \"kubernetes.io/projected/ffc84338-48ac-4538-b134-5993d5a9f91c-kube-api-access-2fclx\") pod \"placement-db-sync-wvxkm\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " pod="openstack/placement-db-sync-wvxkm" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.161958 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbszg\" (UniqueName: \"kubernetes.io/projected/76fba18a-af8c-449a-be74-e2ad6438afa0-kube-api-access-jbszg\") pod \"ceilometer-0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.166525 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-556b65bf97-x8664"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.167938 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.176563 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-556b65bf97-x8664"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.207777 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.209195 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.216107 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.221830 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.221991 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.222181 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vbk5p" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.228791 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-combined-ca-bundle\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.228825 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-config\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " 
pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.228885 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghrb5\" (UniqueName: \"kubernetes.io/projected/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-kube-api-access-ghrb5\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.228914 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-scripts\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.228945 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzn7\" (UniqueName: \"kubernetes.io/projected/ec6c6686-44c7-49ec-950b-7054d96e207d-kube-api-access-fwzn7\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.228968 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-horizon-secret-key\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.228997 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-config-data\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " 
pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229014 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-db-sync-config-data\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229033 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlqkp\" (UniqueName: \"kubernetes.io/projected/6c734bf7-1916-4a47-93e0-42caaaced812-kube-api-access-nlqkp\") pod \"barbican-db-sync-rlvrs\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229055 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-combined-ca-bundle\") pod \"barbican-db-sync-rlvrs\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229087 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdx6\" (UniqueName: \"kubernetes.io/projected/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-kube-api-access-5kdx6\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6c6686-44c7-49ec-950b-7054d96e207d-etc-machine-id\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 
05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229119 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-swift-storage-0\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229138 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-config-data\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229156 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-nb\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229177 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-svc\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229199 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-scripts\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229218 
4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-db-sync-config-data\") pod \"barbican-db-sync-rlvrs\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229246 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-sb\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.229264 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-logs\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.230934 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6c6686-44c7-49ec-950b-7054d96e207d-etc-machine-id\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.233603 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-combined-ca-bundle\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.234548 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-combined-ca-bundle\") pod \"barbican-db-sync-rlvrs\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.238265 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-db-sync-config-data\") pod \"barbican-db-sync-rlvrs\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.238311 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.238781 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-scripts\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.239577 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-config-data\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.252819 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-db-sync-config-data\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.259985 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nlqkp\" (UniqueName: \"kubernetes.io/projected/6c734bf7-1916-4a47-93e0-42caaaced812-kube-api-access-nlqkp\") pod \"barbican-db-sync-rlvrs\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.262475 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzn7\" (UniqueName: \"kubernetes.io/projected/ec6c6686-44c7-49ec-950b-7054d96e207d-kube-api-access-fwzn7\") pod \"cinder-db-sync-66q5h\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.321453 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330573 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-sb\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330659 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-logs\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330698 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-logs\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330732 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7hjl\" (UniqueName: \"kubernetes.io/projected/c978385f-590d-403b-8f7d-74ea245dc9ca-kube-api-access-w7hjl\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330750 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330768 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-config\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330783 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghrb5\" (UniqueName: \"kubernetes.io/projected/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-kube-api-access-ghrb5\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330811 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-scripts\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330843 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-horizon-secret-key\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330876 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-config-data\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330911 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330958 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kdx6\" (UniqueName: \"kubernetes.io/projected/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-kube-api-access-5kdx6\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330975 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-swift-storage-0\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.330996 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-nb\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.331019 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.331043 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.331065 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.331087 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-svc\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.332372 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-sb\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.332614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-logs\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.333460 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-config\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.334040 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-swift-storage-0\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.334157 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-svc\") pod 
\"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.334314 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-scripts\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.334552 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-nb\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.335442 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-config-data\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.337498 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-horizon-secret-key\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.353566 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wvxkm" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.354447 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-66q5h" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.355473 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kdx6\" (UniqueName: \"kubernetes.io/projected/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-kube-api-access-5kdx6\") pod \"dnsmasq-dns-5555fccc9f-gxdtl\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.358646 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghrb5\" (UniqueName: \"kubernetes.io/projected/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-kube-api-access-ghrb5\") pod \"horizon-556b65bf97-x8664\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.389784 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.433489 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.433731 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.433753 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.433768 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.433808 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.433838 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-logs\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.433859 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7hjl\" (UniqueName: \"kubernetes.io/projected/c978385f-590d-403b-8f7d-74ea245dc9ca-kube-api-access-w7hjl\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.433881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.434402 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.438749 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.439198 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.439862 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-logs\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.439915 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " 
pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.447526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.447981 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.456351 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.459160 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7hjl\" (UniqueName: \"kubernetes.io/projected/c978385f-590d-403b-8f7d-74ea245dc9ca-kube-api-access-w7hjl\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.483204 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.528119 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.554781 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.602271 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c965dbf49-dfpr6"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.658304 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-drf74"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.755176 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-685d956dff-kcqnq"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.769577 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-htmd2"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.881595 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.891509 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.909884 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.909937 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.942922 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.953771 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.954011 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.954101 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.954183 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq9nm\" (UniqueName: 
\"kubernetes.io/projected/ce22e317-1f45-443f-b495-0c9b297dc721-kube-api-access-cq9nm\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.954300 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-logs\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.954380 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.954461 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:41 crc kubenswrapper[4717]: I0308 05:47:41.954555 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.056348 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.056407 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.056441 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.056466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq9nm\" (UniqueName: \"kubernetes.io/projected/ce22e317-1f45-443f-b495-0c9b297dc721-kube-api-access-cq9nm\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.056517 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-logs\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.056546 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.056570 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.056597 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.056854 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.057073 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-logs\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.057080 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.061924 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.069959 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.071206 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.072463 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.077983 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq9nm\" (UniqueName: \"kubernetes.io/projected/ce22e317-1f45-443f-b495-0c9b297dc721-kube-api-access-cq9nm\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " 
pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.115897 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.217125 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 05:47:42 crc kubenswrapper[4717]: W0308 05:47:42.310856 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec6c6686_44c7_49ec_950b_7054d96e207d.slice/crio-6cee01344bf2ecd2ba2c16b4842bb76ff0530ad5e1c9ae19c84ebc6d9796f0cf WatchSource:0}: Error finding container 6cee01344bf2ecd2ba2c16b4842bb76ff0530ad5e1c9ae19c84ebc6d9796f0cf: Status 404 returned error can't find the container with id 6cee01344bf2ecd2ba2c16b4842bb76ff0530ad5e1c9ae19c84ebc6d9796f0cf Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.318555 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wvxkm"] Mar 08 05:47:42 crc kubenswrapper[4717]: W0308 05:47:42.325699 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffc84338_48ac_4538_b134_5993d5a9f91c.slice/crio-50df3aaa63164f5805a4ae45d7fe9a34e320c62d31efb087cbac5d52587e9dfb WatchSource:0}: Error finding container 50df3aaa63164f5805a4ae45d7fe9a34e320c62d31efb087cbac5d52587e9dfb: Status 404 returned error can't find the container with id 50df3aaa63164f5805a4ae45d7fe9a34e320c62d31efb087cbac5d52587e9dfb Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.327591 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-htmd2" 
event={"ID":"818627ad-f6eb-43d2-adfc-7daacc7f9b6f","Type":"ContainerStarted","Data":"ed51d405b2bc45be9abcfa817172cb6ff7c1fd2dcc3a0ada30ccd6707e23e39b"} Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.327671 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-htmd2" event={"ID":"818627ad-f6eb-43d2-adfc-7daacc7f9b6f","Type":"ContainerStarted","Data":"ad179de5658abc4d3bc7e1dd85d8b9d92ac86e7c830aa870559efc402947430b"} Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.333070 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-drf74" event={"ID":"02c8cd6e-8f2d-4c20-9fbf-335c620a898e","Type":"ContainerStarted","Data":"9d026ad3aa94009a1b17e448af4616c4bfb226b1a1ccc50684106646ea69f8c6"} Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.333140 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-drf74" event={"ID":"02c8cd6e-8f2d-4c20-9fbf-335c620a898e","Type":"ContainerStarted","Data":"9b67928ca6357ef937bbfe60d7145550c4c15874fbf4662c0a5481460f26703d"} Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.369479 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6" podUID="d5e16c30-bd40-46c7-ab56-185a8551aebe" containerName="init" containerID="cri-o://fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e" gracePeriod=10 Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.370086 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6" event={"ID":"d5e16c30-bd40-46c7-ab56-185a8551aebe","Type":"ContainerStarted","Data":"fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e"} Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.371915 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6" 
event={"ID":"d5e16c30-bd40-46c7-ab56-185a8551aebe","Type":"ContainerStarted","Data":"7c12f1c55eb47bdd8275674a5048a93880b23d0193d86446ce41fa3a44ae1d05"} Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.377817 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rlvrs"] Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.380901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-685d956dff-kcqnq" event={"ID":"fc7fb4bf-d5ea-4ede-9d60-22786afec81d","Type":"ContainerStarted","Data":"55088ea055960e978e4c6e21e0da66a3fc4d84d642155686a4387e374ed0d75b"} Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.380975 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" podUID="a13b845f-ecdb-40c3-9ece-11fce0e6421a" containerName="dnsmasq-dns" containerID="cri-o://48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb" gracePeriod=10 Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.392094 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-66q5h"] Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.423501 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-drf74" podStartSLOduration=2.423477282 podStartE2EDuration="2.423477282s" podCreationTimestamp="2026-03-08 05:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:42.378241948 +0000 UTC m=+1289.295890792" watchObservedRunningTime="2026-03-08 05:47:42.423477282 +0000 UTC m=+1289.341126116" Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.433160 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.641393 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-5555fccc9f-gxdtl"] Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.730525 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-556b65bf97-x8664"] Mar 08 05:47:42 crc kubenswrapper[4717]: I0308 05:47:42.844212 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.069388 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.210117 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.225179 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-556b65bf97-x8664"] Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.250723 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8fbc4b46c-9rg6p"] Mar 08 05:47:43 crc kubenswrapper[4717]: E0308 05:47:43.251199 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e16c30-bd40-46c7-ab56-185a8551aebe" containerName="init" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.251242 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e16c30-bd40-46c7-ab56-185a8551aebe" containerName="init" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.251434 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e16c30-bd40-46c7-ab56-185a8551aebe" containerName="init" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.252527 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.285013 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8fbc4b46c-9rg6p"] Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.314519 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-swift-storage-0\") pod \"d5e16c30-bd40-46c7-ab56-185a8551aebe\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.314579 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-sb\") pod \"d5e16c30-bd40-46c7-ab56-185a8551aebe\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.314620 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-nb\") pod \"d5e16c30-bd40-46c7-ab56-185a8551aebe\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.314679 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-svc\") pod \"d5e16c30-bd40-46c7-ab56-185a8551aebe\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.314717 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-config\") pod \"d5e16c30-bd40-46c7-ab56-185a8551aebe\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " Mar 08 05:47:43 crc 
kubenswrapper[4717]: I0308 05:47:43.314867 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjgcw\" (UniqueName: \"kubernetes.io/projected/d5e16c30-bd40-46c7-ab56-185a8551aebe-kube-api-access-mjgcw\") pod \"d5e16c30-bd40-46c7-ab56-185a8551aebe\" (UID: \"d5e16c30-bd40-46c7-ab56-185a8551aebe\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.315102 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-scripts\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.315187 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5d8149f-3efd-47b3-a228-bd47a3bf2073-logs\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.315204 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-config-data\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.317951 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5d8149f-3efd-47b3-a228-bd47a3bf2073-horizon-secret-key\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.318001 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdv6r\" (UniqueName: \"kubernetes.io/projected/a5d8149f-3efd-47b3-a228-bd47a3bf2073-kube-api-access-fdv6r\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.334654 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e16c30-bd40-46c7-ab56-185a8551aebe-kube-api-access-mjgcw" (OuterVolumeSpecName: "kube-api-access-mjgcw") pod "d5e16c30-bd40-46c7-ab56-185a8551aebe" (UID: "d5e16c30-bd40-46c7-ab56-185a8551aebe"). InnerVolumeSpecName "kube-api-access-mjgcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.412841 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.423391 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5d8149f-3efd-47b3-a228-bd47a3bf2073-horizon-secret-key\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.423444 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdv6r\" (UniqueName: \"kubernetes.io/projected/a5d8149f-3efd-47b3-a228-bd47a3bf2073-kube-api-access-fdv6r\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.423501 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-scripts\") pod \"horizon-8fbc4b46c-9rg6p\" 
(UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.423594 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5d8149f-3efd-47b3-a228-bd47a3bf2073-logs\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.423617 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-config-data\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.423724 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjgcw\" (UniqueName: \"kubernetes.io/projected/d5e16c30-bd40-46c7-ab56-185a8551aebe-kube-api-access-mjgcw\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.424944 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-config-data\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.425357 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-scripts\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.425557 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a5d8149f-3efd-47b3-a228-bd47a3bf2073-logs\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.441188 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.442354 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5d8149f-3efd-47b3-a228-bd47a3bf2073-horizon-secret-key\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.442824 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d5e16c30-bd40-46c7-ab56-185a8551aebe" (UID: "d5e16c30-bd40-46c7-ab56-185a8551aebe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.443061 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5e16c30-bd40-46c7-ab56-185a8551aebe" (UID: "d5e16c30-bd40-46c7-ab56-185a8551aebe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.447364 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5e16c30-bd40-46c7-ab56-185a8551aebe" (UID: "d5e16c30-bd40-46c7-ab56-185a8551aebe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.448869 4717 generic.go:334] "Generic (PLEG): container finished" podID="d5e16c30-bd40-46c7-ab56-185a8551aebe" containerID="fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e" exitCode=0 Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.448967 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.449081 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdv6r\" (UniqueName: \"kubernetes.io/projected/a5d8149f-3efd-47b3-a228-bd47a3bf2073-kube-api-access-fdv6r\") pod \"horizon-8fbc4b46c-9rg6p\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.448977 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6" event={"ID":"d5e16c30-bd40-46c7-ab56-185a8551aebe","Type":"ContainerDied","Data":"fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.449148 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c965dbf49-dfpr6" event={"ID":"d5e16c30-bd40-46c7-ab56-185a8551aebe","Type":"ContainerDied","Data":"7c12f1c55eb47bdd8275674a5048a93880b23d0193d86446ce41fa3a44ae1d05"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.449174 4717 scope.go:117] "RemoveContainer" containerID="fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.449202 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-config" (OuterVolumeSpecName: "config") pod "d5e16c30-bd40-46c7-ab56-185a8551aebe" (UID: 
"d5e16c30-bd40-46c7-ab56-185a8551aebe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.460651 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.465454 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5e16c30-bd40-46c7-ab56-185a8551aebe" (UID: "d5e16c30-bd40-46c7-ab56-185a8551aebe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.474947 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.493871 4717 scope.go:117] "RemoveContainer" containerID="fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.494191 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fba18a-af8c-449a-be74-e2ad6438afa0","Type":"ContainerStarted","Data":"ac6eb310008bc6a20628caff5ecb63d45a26cf6dfa1f6415c1c705f250ce1691"} Mar 08 05:47:43 crc kubenswrapper[4717]: E0308 05:47:43.498884 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e\": container with ID starting with fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e not found: ID does not exist" containerID="fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.498921 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e"} err="failed to get container status \"fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e\": rpc error: code = NotFound desc = could not find container \"fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e\": container with ID starting with fcfacbba3fbdd564f57142e0d0bed658aaa1a6a47882355cddb8c8bf3fe37f9e not found: ID does not exist" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.510878 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" event={"ID":"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10","Type":"ContainerStarted","Data":"87d29f1eb8804f06c6632b306457130ee09bacc49454aa5b95367a2b5d1f03f1"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.528328 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msxff\" (UniqueName: \"kubernetes.io/projected/a13b845f-ecdb-40c3-9ece-11fce0e6421a-kube-api-access-msxff\") pod \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.528371 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-svc\") pod \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.528443 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-nb\") pod \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.528467 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-config\") pod \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.528486 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-swift-storage-0\") pod \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.528582 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-sb\") pod \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\" (UID: \"a13b845f-ecdb-40c3-9ece-11fce0e6421a\") " Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.528967 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.528987 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.528999 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.529011 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc 
kubenswrapper[4717]: I0308 05:47:43.529024 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5e16c30-bd40-46c7-ab56-185a8551aebe-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.539178 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c978385f-590d-403b-8f7d-74ea245dc9ca","Type":"ContainerStarted","Data":"7b6968d24906ec835c28df8bc180c2fa1d6d4e27211e05aee4c689c2ac906a6f"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.561653 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-66q5h" event={"ID":"ec6c6686-44c7-49ec-950b-7054d96e207d","Type":"ContainerStarted","Data":"6cee01344bf2ecd2ba2c16b4842bb76ff0530ad5e1c9ae19c84ebc6d9796f0cf"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.563617 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce22e317-1f45-443f-b495-0c9b297dc721","Type":"ContainerStarted","Data":"082ff64934e71f4212e48d288296a7ce39dd2fc6b93019bd202f2a9da584a659"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.564689 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-556b65bf97-x8664" event={"ID":"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0","Type":"ContainerStarted","Data":"e98042916b9c3156f8aacd514be6b9ab929d2ccd043b2f4303d6bc42a46f400d"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.565979 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13b845f-ecdb-40c3-9ece-11fce0e6421a-kube-api-access-msxff" (OuterVolumeSpecName: "kube-api-access-msxff") pod "a13b845f-ecdb-40c3-9ece-11fce0e6421a" (UID: "a13b845f-ecdb-40c3-9ece-11fce0e6421a"). InnerVolumeSpecName "kube-api-access-msxff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.571401 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wvxkm" event={"ID":"ffc84338-48ac-4538-b134-5993d5a9f91c","Type":"ContainerStarted","Data":"50df3aaa63164f5805a4ae45d7fe9a34e320c62d31efb087cbac5d52587e9dfb"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.574848 4717 generic.go:334] "Generic (PLEG): container finished" podID="a13b845f-ecdb-40c3-9ece-11fce0e6421a" containerID="48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb" exitCode=0 Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.574912 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" event={"ID":"a13b845f-ecdb-40c3-9ece-11fce0e6421a","Type":"ContainerDied","Data":"48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.574929 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" event={"ID":"a13b845f-ecdb-40c3-9ece-11fce0e6421a","Type":"ContainerDied","Data":"99a627e4d199935b57495dda9f3d0a580dd87e1a8cbbc52330dd14b0e373478d"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.574945 4717 scope.go:117] "RemoveContainer" containerID="48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.575035 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84ccd66db5-zqj6j" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.588989 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rlvrs" event={"ID":"6c734bf7-1916-4a47-93e0-42caaaced812","Type":"ContainerStarted","Data":"cfac0f7783e1d7179f3cacfa158c065d54546d7a248bca6a00e3edd10d38e709"} Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.608048 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.617864 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a13b845f-ecdb-40c3-9ece-11fce0e6421a" (UID: "a13b845f-ecdb-40c3-9ece-11fce0e6421a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.630534 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msxff\" (UniqueName: \"kubernetes.io/projected/a13b845f-ecdb-40c3-9ece-11fce0e6421a-kube-api-access-msxff\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.630563 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.636374 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-config" (OuterVolumeSpecName: "config") pod "a13b845f-ecdb-40c3-9ece-11fce0e6421a" (UID: "a13b845f-ecdb-40c3-9ece-11fce0e6421a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.637052 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-htmd2" podStartSLOduration=3.637027541 podStartE2EDuration="3.637027541s" podCreationTimestamp="2026-03-08 05:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:43.609020021 +0000 UTC m=+1290.526668865" watchObservedRunningTime="2026-03-08 05:47:43.637027541 +0000 UTC m=+1290.554676375" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.648249 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a13b845f-ecdb-40c3-9ece-11fce0e6421a" (UID: "a13b845f-ecdb-40c3-9ece-11fce0e6421a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.659182 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a13b845f-ecdb-40c3-9ece-11fce0e6421a" (UID: "a13b845f-ecdb-40c3-9ece-11fce0e6421a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.667598 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a13b845f-ecdb-40c3-9ece-11fce0e6421a" (UID: "a13b845f-ecdb-40c3-9ece-11fce0e6421a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.731733 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.731766 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.731777 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.731786 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13b845f-ecdb-40c3-9ece-11fce0e6421a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.833390 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c965dbf49-dfpr6"] Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.836242 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c965dbf49-dfpr6"] Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.914016 4717 scope.go:117] "RemoveContainer" containerID="3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc" Mar 08 05:47:43 crc kubenswrapper[4717]: I0308 05:47:43.985213 4717 scope.go:117] "RemoveContainer" containerID="48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb" Mar 08 05:47:44 crc kubenswrapper[4717]: E0308 05:47:44.011872 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb\": container with ID starting with 48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb not found: ID does not exist" containerID="48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb" Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.011913 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb"} err="failed to get container status \"48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb\": rpc error: code = NotFound desc = could not find container \"48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb\": container with ID starting with 48aafc4ef186f92ad99d9bb4628c6b3841bf1463ea065787e96939d486e1dacb not found: ID does not exist" Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.011943 4717 scope.go:117] "RemoveContainer" containerID="3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc" Mar 08 05:47:44 crc kubenswrapper[4717]: E0308 05:47:44.015049 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc\": container with ID starting with 3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc not found: ID does not exist" containerID="3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc" Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.015097 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc"} err="failed to get container status \"3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc\": rpc error: code = NotFound desc = could not find container \"3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc\": container with ID 
starting with 3174e6158840b7e33111c4476146fca6b654b8385d4413e5f33f0566f2ca91dc not found: ID does not exist" Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.109873 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84ccd66db5-zqj6j"] Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.119528 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84ccd66db5-zqj6j"] Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.336713 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8fbc4b46c-9rg6p"] Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.601261 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce22e317-1f45-443f-b495-0c9b297dc721","Type":"ContainerStarted","Data":"c66da218ef4dcfd150c269a841cdaf78fa977264bfea9a42b75c0f73aebc2824"} Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.604988 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8fbc4b46c-9rg6p" event={"ID":"a5d8149f-3efd-47b3-a228-bd47a3bf2073","Type":"ContainerStarted","Data":"b883748a24317a4a8e9ac7dec3308982ad042b52c4a9ac24f37830a0627d58d6"} Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.612546 4717 generic.go:334] "Generic (PLEG): container finished" podID="6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" containerID="4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb" exitCode=0 Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.612595 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" event={"ID":"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10","Type":"ContainerDied","Data":"4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb"} Mar 08 05:47:44 crc kubenswrapper[4717]: I0308 05:47:44.619465 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"c978385f-590d-403b-8f7d-74ea245dc9ca","Type":"ContainerStarted","Data":"b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5"} Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.762095 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" event={"ID":"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10","Type":"ContainerStarted","Data":"03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90"} Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.762911 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.767438 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c978385f-590d-403b-8f7d-74ea245dc9ca","Type":"ContainerStarted","Data":"6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c"} Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.768250 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c978385f-590d-403b-8f7d-74ea245dc9ca" containerName="glance-httpd" containerID="cri-o://6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c" gracePeriod=30 Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.774420 4717 generic.go:334] "Generic (PLEG): container finished" podID="90593a29-3f8c-4228-8c82-a183a4e33054" containerID="2ecc3c5b09cedad03e8d0a14eaa8eaaf53a5377ab3d674d50a3decf8ef90d3b1" exitCode=0 Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.774459 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-m97sm" event={"ID":"90593a29-3f8c-4228-8c82-a183a4e33054","Type":"ContainerDied","Data":"2ecc3c5b09cedad03e8d0a14eaa8eaaf53a5377ab3d674d50a3decf8ef90d3b1"} Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.767807 4717 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/glance-default-external-api-0" podUID="c978385f-590d-403b-8f7d-74ea245dc9ca" containerName="glance-log" containerID="cri-o://b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5" gracePeriod=30 Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.793782 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" podStartSLOduration=5.793763255 podStartE2EDuration="5.793763255s" podCreationTimestamp="2026-03-08 05:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:45.789738555 +0000 UTC m=+1292.707387399" watchObservedRunningTime="2026-03-08 05:47:45.793763255 +0000 UTC m=+1292.711412099" Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.801883 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13b845f-ecdb-40c3-9ece-11fce0e6421a" path="/var/lib/kubelet/pods/a13b845f-ecdb-40c3-9ece-11fce0e6421a/volumes" Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.802723 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e16c30-bd40-46c7-ab56-185a8551aebe" path="/var/lib/kubelet/pods/d5e16c30-bd40-46c7-ab56-185a8551aebe/volumes" Mar 08 05:47:45 crc kubenswrapper[4717]: I0308 05:47:45.814417 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.814401193 podStartE2EDuration="4.814401193s" podCreationTimestamp="2026-03-08 05:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:45.810424935 +0000 UTC m=+1292.728073779" watchObservedRunningTime="2026-03-08 05:47:45.814401193 +0000 UTC m=+1292.732050037" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.536711 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.659076 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-combined-ca-bundle\") pod \"c978385f-590d-403b-8f7d-74ea245dc9ca\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.659121 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-scripts\") pod \"c978385f-590d-403b-8f7d-74ea245dc9ca\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.659162 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-logs\") pod \"c978385f-590d-403b-8f7d-74ea245dc9ca\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.659207 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-httpd-run\") pod \"c978385f-590d-403b-8f7d-74ea245dc9ca\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.659228 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7hjl\" (UniqueName: \"kubernetes.io/projected/c978385f-590d-403b-8f7d-74ea245dc9ca-kube-api-access-w7hjl\") pod \"c978385f-590d-403b-8f7d-74ea245dc9ca\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.659256 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"c978385f-590d-403b-8f7d-74ea245dc9ca\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.659277 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-public-tls-certs\") pod \"c978385f-590d-403b-8f7d-74ea245dc9ca\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.659310 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-config-data\") pod \"c978385f-590d-403b-8f7d-74ea245dc9ca\" (UID: \"c978385f-590d-403b-8f7d-74ea245dc9ca\") " Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.659638 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-logs" (OuterVolumeSpecName: "logs") pod "c978385f-590d-403b-8f7d-74ea245dc9ca" (UID: "c978385f-590d-403b-8f7d-74ea245dc9ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.659741 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c978385f-590d-403b-8f7d-74ea245dc9ca" (UID: "c978385f-590d-403b-8f7d-74ea245dc9ca"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.660165 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.660183 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c978385f-590d-403b-8f7d-74ea245dc9ca-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.670837 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c978385f-590d-403b-8f7d-74ea245dc9ca-kube-api-access-w7hjl" (OuterVolumeSpecName: "kube-api-access-w7hjl") pod "c978385f-590d-403b-8f7d-74ea245dc9ca" (UID: "c978385f-590d-403b-8f7d-74ea245dc9ca"). InnerVolumeSpecName "kube-api-access-w7hjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.670855 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c978385f-590d-403b-8f7d-74ea245dc9ca" (UID: "c978385f-590d-403b-8f7d-74ea245dc9ca"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.683664 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-scripts" (OuterVolumeSpecName: "scripts") pod "c978385f-590d-403b-8f7d-74ea245dc9ca" (UID: "c978385f-590d-403b-8f7d-74ea245dc9ca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.706057 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c978385f-590d-403b-8f7d-74ea245dc9ca" (UID: "c978385f-590d-403b-8f7d-74ea245dc9ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.732823 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c978385f-590d-403b-8f7d-74ea245dc9ca" (UID: "c978385f-590d-403b-8f7d-74ea245dc9ca"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.738721 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-config-data" (OuterVolumeSpecName: "config-data") pod "c978385f-590d-403b-8f7d-74ea245dc9ca" (UID: "c978385f-590d-403b-8f7d-74ea245dc9ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.762295 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7hjl\" (UniqueName: \"kubernetes.io/projected/c978385f-590d-403b-8f7d-74ea245dc9ca-kube-api-access-w7hjl\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.762342 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.762353 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.762362 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.762371 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.762379 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c978385f-590d-403b-8f7d-74ea245dc9ca-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.780695 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.794563 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="c978385f-590d-403b-8f7d-74ea245dc9ca" containerID="6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c" exitCode=143 Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.794588 4717 generic.go:334] "Generic (PLEG): container finished" podID="c978385f-590d-403b-8f7d-74ea245dc9ca" containerID="b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5" exitCode=143 Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.794624 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c978385f-590d-403b-8f7d-74ea245dc9ca","Type":"ContainerDied","Data":"6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c"} Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.794650 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c978385f-590d-403b-8f7d-74ea245dc9ca","Type":"ContainerDied","Data":"b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5"} Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.794659 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c978385f-590d-403b-8f7d-74ea245dc9ca","Type":"ContainerDied","Data":"7b6968d24906ec835c28df8bc180c2fa1d6d4e27211e05aee4c689c2ac906a6f"} Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.794674 4717 scope.go:117] "RemoveContainer" containerID="6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.794819 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.800671 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ce22e317-1f45-443f-b495-0c9b297dc721" containerName="glance-log" containerID="cri-o://c66da218ef4dcfd150c269a841cdaf78fa977264bfea9a42b75c0f73aebc2824" gracePeriod=30 Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.800904 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce22e317-1f45-443f-b495-0c9b297dc721","Type":"ContainerStarted","Data":"027e1cd56a577cf8e15a05dc5f68938811b1fb8b11346d437ee869cfbd1124af"} Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.801250 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ce22e317-1f45-443f-b495-0c9b297dc721" containerName="glance-httpd" containerID="cri-o://027e1cd56a577cf8e15a05dc5f68938811b1fb8b11346d437ee869cfbd1124af" gracePeriod=30 Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.828591 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.828568483 podStartE2EDuration="6.828568483s" podCreationTimestamp="2026-03-08 05:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:47:46.823732734 +0000 UTC m=+1293.741381578" watchObservedRunningTime="2026-03-08 05:47:46.828568483 +0000 UTC m=+1293.746217327" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.840318 4717 scope.go:117] "RemoveContainer" containerID="b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.864460 4717 reconciler_common.go:293] "Volume detached for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.865651 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.880276 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.905531 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:47:46 crc kubenswrapper[4717]: E0308 05:47:46.905932 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c978385f-590d-403b-8f7d-74ea245dc9ca" containerName="glance-httpd" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.905943 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c978385f-590d-403b-8f7d-74ea245dc9ca" containerName="glance-httpd" Mar 08 05:47:46 crc kubenswrapper[4717]: E0308 05:47:46.905968 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b845f-ecdb-40c3-9ece-11fce0e6421a" containerName="dnsmasq-dns" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.905974 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b845f-ecdb-40c3-9ece-11fce0e6421a" containerName="dnsmasq-dns" Mar 08 05:47:46 crc kubenswrapper[4717]: E0308 05:47:46.905986 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b845f-ecdb-40c3-9ece-11fce0e6421a" containerName="init" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.905992 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b845f-ecdb-40c3-9ece-11fce0e6421a" containerName="init" Mar 08 05:47:46 crc kubenswrapper[4717]: E0308 05:47:46.906010 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c978385f-590d-403b-8f7d-74ea245dc9ca" containerName="glance-log" Mar 08 05:47:46 crc 
kubenswrapper[4717]: I0308 05:47:46.906017 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c978385f-590d-403b-8f7d-74ea245dc9ca" containerName="glance-log" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.906212 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c978385f-590d-403b-8f7d-74ea245dc9ca" containerName="glance-httpd" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.906222 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b845f-ecdb-40c3-9ece-11fce0e6421a" containerName="dnsmasq-dns" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.906240 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c978385f-590d-403b-8f7d-74ea245dc9ca" containerName="glance-log" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.907268 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.910157 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.912298 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 05:47:46 crc kubenswrapper[4717]: I0308 05:47:46.926732 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.068412 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n45h\" (UniqueName: \"kubernetes.io/projected/f8132755-0c53-4feb-80d9-1e86d52d0ea8-kube-api-access-7n45h\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.068493 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.068521 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.068560 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.068576 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.068649 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-logs\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.068671 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.068724 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.171541 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.171888 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n45h\" (UniqueName: \"kubernetes.io/projected/f8132755-0c53-4feb-80d9-1e86d52d0ea8-kube-api-access-7n45h\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.171944 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.171972 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.172467 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.172497 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.172600 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-logs\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.172628 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.173300 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.173399 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-logs\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.177471 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.183195 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.185328 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.190150 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n45h\" (UniqueName: \"kubernetes.io/projected/f8132755-0c53-4feb-80d9-1e86d52d0ea8-kube-api-access-7n45h\") 
pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.191609 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-config-data\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.207066 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.209423 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-scripts\") pod \"glance-default-external-api-0\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.281234 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.384156 4717 scope.go:117] "RemoveContainer" containerID="6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c" Mar 08 05:47:47 crc kubenswrapper[4717]: E0308 05:47:47.385035 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c\": container with ID starting with 6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c not found: ID does not exist" containerID="6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.385067 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c"} err="failed to get container status \"6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c\": rpc error: code = NotFound desc = could not find container \"6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c\": container with ID starting with 6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c not found: ID does not exist" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.385084 4717 scope.go:117] "RemoveContainer" containerID="b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5" Mar 08 05:47:47 crc kubenswrapper[4717]: E0308 05:47:47.385384 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5\": container with ID starting with b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5 not found: ID does not exist" containerID="b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 
05:47:47.385405 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5"} err="failed to get container status \"b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5\": rpc error: code = NotFound desc = could not find container \"b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5\": container with ID starting with b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5 not found: ID does not exist" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.385416 4717 scope.go:117] "RemoveContainer" containerID="6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.388230 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c"} err="failed to get container status \"6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c\": rpc error: code = NotFound desc = could not find container \"6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c\": container with ID starting with 6283f083fdb7aa3149c35794df373429510a8d03a4c449aef83c8417aea7466c not found: ID does not exist" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.388250 4717 scope.go:117] "RemoveContainer" containerID="b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.388539 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5"} err="failed to get container status \"b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5\": rpc error: code = NotFound desc = could not find container \"b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5\": container with ID starting with 
b685a09c444e63d46719575d71496891ff3a80f2995a57ceddcb33e2c003cba5 not found: ID does not exist" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.470966 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.586437 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klqd2\" (UniqueName: \"kubernetes.io/projected/90593a29-3f8c-4228-8c82-a183a4e33054-kube-api-access-klqd2\") pod \"90593a29-3f8c-4228-8c82-a183a4e33054\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.586880 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-combined-ca-bundle\") pod \"90593a29-3f8c-4228-8c82-a183a4e33054\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.586931 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-db-sync-config-data\") pod \"90593a29-3f8c-4228-8c82-a183a4e33054\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.587039 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-config-data\") pod \"90593a29-3f8c-4228-8c82-a183a4e33054\" (UID: \"90593a29-3f8c-4228-8c82-a183a4e33054\") " Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.592996 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod 
"90593a29-3f8c-4228-8c82-a183a4e33054" (UID: "90593a29-3f8c-4228-8c82-a183a4e33054"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.593060 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90593a29-3f8c-4228-8c82-a183a4e33054-kube-api-access-klqd2" (OuterVolumeSpecName: "kube-api-access-klqd2") pod "90593a29-3f8c-4228-8c82-a183a4e33054" (UID: "90593a29-3f8c-4228-8c82-a183a4e33054"). InnerVolumeSpecName "kube-api-access-klqd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.610282 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90593a29-3f8c-4228-8c82-a183a4e33054" (UID: "90593a29-3f8c-4228-8c82-a183a4e33054"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.635522 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-config-data" (OuterVolumeSpecName: "config-data") pod "90593a29-3f8c-4228-8c82-a183a4e33054" (UID: "90593a29-3f8c-4228-8c82-a183a4e33054"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.690342 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.690374 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klqd2\" (UniqueName: \"kubernetes.io/projected/90593a29-3f8c-4228-8c82-a183a4e33054-kube-api-access-klqd2\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.690385 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.690394 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90593a29-3f8c-4228-8c82-a183a4e33054-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.802217 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c978385f-590d-403b-8f7d-74ea245dc9ca" path="/var/lib/kubelet/pods/c978385f-590d-403b-8f7d-74ea245dc9ca/volumes" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.812774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-m97sm" event={"ID":"90593a29-3f8c-4228-8c82-a183a4e33054","Type":"ContainerDied","Data":"aed1934d902787f3474bcb89c851d02f6d998b85092e05a66e04798235a857f4"} Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.812803 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed1934d902787f3474bcb89c851d02f6d998b85092e05a66e04798235a857f4" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.812848 4717 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-m97sm" Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.828145 4717 generic.go:334] "Generic (PLEG): container finished" podID="ce22e317-1f45-443f-b495-0c9b297dc721" containerID="c66da218ef4dcfd150c269a841cdaf78fa977264bfea9a42b75c0f73aebc2824" exitCode=143 Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.828179 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce22e317-1f45-443f-b495-0c9b297dc721","Type":"ContainerDied","Data":"c66da218ef4dcfd150c269a841cdaf78fa977264bfea9a42b75c0f73aebc2824"} Mar 08 05:47:47 crc kubenswrapper[4717]: I0308 05:47:47.971620 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:47:47 crc kubenswrapper[4717]: W0308 05:47:47.981881 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8132755_0c53_4feb_80d9_1e86d52d0ea8.slice/crio-e3e985bcbceae0c9e92b492d0723d87e9693ee7de0b8b15c3a5665f2ad8db155 WatchSource:0}: Error finding container e3e985bcbceae0c9e92b492d0723d87e9693ee7de0b8b15c3a5665f2ad8db155: Status 404 returned error can't find the container with id e3e985bcbceae0c9e92b492d0723d87e9693ee7de0b8b15c3a5665f2ad8db155 Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.085303 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 08 05:47:48 crc kubenswrapper[4717]: E0308 05:47:48.085648 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90593a29-3f8c-4228-8c82-a183a4e33054" containerName="watcher-db-sync" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.085663 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="90593a29-3f8c-4228-8c82-a183a4e33054" containerName="watcher-db-sync" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.085848 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="90593a29-3f8c-4228-8c82-a183a4e33054" containerName="watcher-db-sync" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.086383 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.089853 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-4bgvm" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.090046 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.108062 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.199322 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.199434 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.199470 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-logs\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " 
pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.199552 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.199644 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbv64\" (UniqueName: \"kubernetes.io/projected/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-kube-api-access-xbv64\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.256389 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.257629 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.262604 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.277877 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.281338 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.287215 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.302795 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.303641 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-logs\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.303747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.303816 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbv64\" (UniqueName: \"kubernetes.io/projected/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-kube-api-access-xbv64\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.303886 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.303909 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.306112 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-logs\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.311574 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.312643 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.312990 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.321308 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbv64\" (UniqueName: \"kubernetes.io/projected/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-kube-api-access-xbv64\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 
05:47:48.329615 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.405854 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6be711ba-e0dc-4d84-a9d3-910819cc02e3-logs\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.405922 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-config-data\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.405963 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.406048 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md6lk\" (UniqueName: \"kubernetes.io/projected/6be711ba-e0dc-4d84-a9d3-910819cc02e3-kube-api-access-md6lk\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.406069 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-config-data\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.406083 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.406108 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b1d992-0078-406b-ade7-6710e9a62c96-logs\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.406146 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plglf\" (UniqueName: \"kubernetes.io/projected/62b1d992-0078-406b-ade7-6710e9a62c96-kube-api-access-plglf\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.406273 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.479034 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.513053 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.513148 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md6lk\" (UniqueName: \"kubernetes.io/projected/6be711ba-e0dc-4d84-a9d3-910819cc02e3-kube-api-access-md6lk\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.513169 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-config-data\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.513188 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.513209 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b1d992-0078-406b-ade7-6710e9a62c96-logs\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.513250 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-plglf\" (UniqueName: \"kubernetes.io/projected/62b1d992-0078-406b-ade7-6710e9a62c96-kube-api-access-plglf\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.513652 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b1d992-0078-406b-ade7-6710e9a62c96-logs\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.513711 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.513787 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6be711ba-e0dc-4d84-a9d3-910819cc02e3-logs\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.513812 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-config-data\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.514097 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6be711ba-e0dc-4d84-a9d3-910819cc02e3-logs\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " 
pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.516370 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.516738 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.517328 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-config-data\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.518601 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-config-data\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.533607 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.533668 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md6lk\" (UniqueName: 
\"kubernetes.io/projected/6be711ba-e0dc-4d84-a9d3-910819cc02e3-kube-api-access-md6lk\") pod \"watcher-applier-0\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.534428 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plglf\" (UniqueName: \"kubernetes.io/projected/62b1d992-0078-406b-ade7-6710e9a62c96-kube-api-access-plglf\") pod \"watcher-api-0\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.578096 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.605411 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.840123 4717 generic.go:334] "Generic (PLEG): container finished" podID="ce22e317-1f45-443f-b495-0c9b297dc721" containerID="027e1cd56a577cf8e15a05dc5f68938811b1fb8b11346d437ee869cfbd1124af" exitCode=0 Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.840286 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce22e317-1f45-443f-b495-0c9b297dc721","Type":"ContainerDied","Data":"027e1cd56a577cf8e15a05dc5f68938811b1fb8b11346d437ee869cfbd1124af"} Mar 08 05:47:48 crc kubenswrapper[4717]: I0308 05:47:48.842096 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8132755-0c53-4feb-80d9-1e86d52d0ea8","Type":"ContainerStarted","Data":"e3e985bcbceae0c9e92b492d0723d87e9693ee7de0b8b15c3a5665f2ad8db155"} Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.738263 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-685d956dff-kcqnq"] Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 
05:47:49.768267 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cf759c7cb-qxb65"] Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.770438 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.774095 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.808827 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.808879 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cf759c7cb-qxb65"] Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.838501 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8fbc4b46c-9rg6p"] Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.857808 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-69bcb664dd-nb94m"] Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.860422 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.866933 4717 generic.go:334] "Generic (PLEG): container finished" podID="02c8cd6e-8f2d-4c20-9fbf-335c620a898e" containerID="9d026ad3aa94009a1b17e448af4616c4bfb226b1a1ccc50684106646ea69f8c6" exitCode=0 Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.866983 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-drf74" event={"ID":"02c8cd6e-8f2d-4c20-9fbf-335c620a898e","Type":"ContainerDied","Data":"9d026ad3aa94009a1b17e448af4616c4bfb226b1a1ccc50684106646ea69f8c6"} Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.873846 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69bcb664dd-nb94m"] Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.945631 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-scripts\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.945752 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-config-data\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.945771 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k2jb\" (UniqueName: \"kubernetes.io/projected/a68b99b9-3abd-4e46-b116-c740daf70c8f-kube-api-access-7k2jb\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:49 
crc kubenswrapper[4717]: I0308 05:47:49.945793 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ab815c4-1b4d-499a-af69-f5e5907c9542-config-data\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.945811 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68b99b9-3abd-4e46-b116-c740daf70c8f-logs\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.945844 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5qw7\" (UniqueName: \"kubernetes.io/projected/9ab815c4-1b4d-499a-af69-f5e5907c9542-kube-api-access-z5qw7\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.945868 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ab815c4-1b4d-499a-af69-f5e5907c9542-horizon-secret-key\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.945890 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-combined-ca-bundle\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:49 
crc kubenswrapper[4717]: I0308 05:47:49.946141 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-tls-certs\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.946350 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab815c4-1b4d-499a-af69-f5e5907c9542-logs\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.946417 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-secret-key\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.946509 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab815c4-1b4d-499a-af69-f5e5907c9542-combined-ca-bundle\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:49 crc kubenswrapper[4717]: I0308 05:47:49.946623 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ab815c4-1b4d-499a-af69-f5e5907c9542-scripts\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:49 crc 
kubenswrapper[4717]: I0308 05:47:49.946665 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab815c4-1b4d-499a-af69-f5e5907c9542-horizon-tls-certs\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.047939 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ab815c4-1b4d-499a-af69-f5e5907c9542-scripts\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.047982 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab815c4-1b4d-499a-af69-f5e5907c9542-horizon-tls-certs\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048015 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-scripts\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048042 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-config-data\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048056 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7k2jb\" (UniqueName: \"kubernetes.io/projected/a68b99b9-3abd-4e46-b116-c740daf70c8f-kube-api-access-7k2jb\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048071 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ab815c4-1b4d-499a-af69-f5e5907c9542-config-data\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048085 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68b99b9-3abd-4e46-b116-c740daf70c8f-logs\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048113 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5qw7\" (UniqueName: \"kubernetes.io/projected/9ab815c4-1b4d-499a-af69-f5e5907c9542-kube-api-access-z5qw7\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ab815c4-1b4d-499a-af69-f5e5907c9542-horizon-secret-key\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048187 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-combined-ca-bundle\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048249 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-tls-certs\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048307 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab815c4-1b4d-499a-af69-f5e5907c9542-logs\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048332 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-secret-key\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.048375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab815c4-1b4d-499a-af69-f5e5907c9542-combined-ca-bundle\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.050412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-scripts\") pod \"horizon-7cf759c7cb-qxb65\" (UID: 
\"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.050483 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68b99b9-3abd-4e46-b116-c740daf70c8f-logs\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.051206 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-config-data\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.051601 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab815c4-1b4d-499a-af69-f5e5907c9542-logs\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.052490 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ab815c4-1b4d-499a-af69-f5e5907c9542-scripts\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.052653 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ab815c4-1b4d-499a-af69-f5e5907c9542-config-data\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.054309 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab815c4-1b4d-499a-af69-f5e5907c9542-combined-ca-bundle\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.056642 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-combined-ca-bundle\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.057781 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-secret-key\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.058152 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9ab815c4-1b4d-499a-af69-f5e5907c9542-horizon-secret-key\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.058337 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab815c4-1b4d-499a-af69-f5e5907c9542-horizon-tls-certs\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.069479 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k2jb\" (UniqueName: 
\"kubernetes.io/projected/a68b99b9-3abd-4e46-b116-c740daf70c8f-kube-api-access-7k2jb\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.072295 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-tls-certs\") pod \"horizon-7cf759c7cb-qxb65\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.083760 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5qw7\" (UniqueName: \"kubernetes.io/projected/9ab815c4-1b4d-499a-af69-f5e5907c9542-kube-api-access-z5qw7\") pod \"horizon-69bcb664dd-nb94m\" (UID: \"9ab815c4-1b4d-499a-af69-f5e5907c9542\") " pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.101236 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:47:50 crc kubenswrapper[4717]: I0308 05:47:50.182029 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:47:51 crc kubenswrapper[4717]: I0308 05:47:51.449946 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:47:51 crc kubenswrapper[4717]: I0308 05:47:51.541777 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf4dd7b85-w8lmr"] Mar 08 05:47:51 crc kubenswrapper[4717]: I0308 05:47:51.542052 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerName="dnsmasq-dns" containerID="cri-o://2a74ed5fd2bc4bf0d6a0c5137e41e1810d7da7df39de8f7139ead1105a730543" gracePeriod=10 Mar 08 05:47:51 crc kubenswrapper[4717]: I0308 05:47:51.888323 4717 generic.go:334] "Generic (PLEG): container finished" podID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerID="2a74ed5fd2bc4bf0d6a0c5137e41e1810d7da7df39de8f7139ead1105a730543" exitCode=0 Mar 08 05:47:51 crc kubenswrapper[4717]: I0308 05:47:51.888364 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" event={"ID":"9b2a46ff-0442-4a0e-b8da-7c24b20df3da","Type":"ContainerDied","Data":"2a74ed5fd2bc4bf0d6a0c5137e41e1810d7da7df39de8f7139ead1105a730543"} Mar 08 05:47:52 crc kubenswrapper[4717]: I0308 05:47:52.585375 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Mar 08 05:47:57 crc kubenswrapper[4717]: I0308 05:47:57.585042 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Mar 08 
05:47:59 crc kubenswrapper[4717]: E0308 05:47:59.724122 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Mar 08 05:47:59 crc kubenswrapper[4717]: E0308 05:47:59.724633 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Mar 08 05:47:59 crc kubenswrapper[4717]: E0308 05:47:59.724958 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-master-centos10/openstack-horizon:current,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n88h5dfh4hcfh5cbh5dfh677h5b9hf9h57bh7fh658hdbh97h85h64fhd6h654h5b6h5d6h58dhd4hbch5d6h5f9h597h79h96h657hcch59ch69q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hgk8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/de
v/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-685d956dff-kcqnq_openstack(fc7fb4bf-d5ea-4ede-9d60-22786afec81d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:47:59 crc kubenswrapper[4717]: E0308 05:47:59.727274 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-horizon:current\\\"\"]" pod="openstack/horizon-685d956dff-kcqnq" podUID="fc7fb4bf-d5ea-4ede-9d60-22786afec81d" Mar 08 05:48:00 crc kubenswrapper[4717]: I0308 05:48:00.138752 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549148-nrst9"] Mar 08 05:48:00 crc kubenswrapper[4717]: I0308 05:48:00.140129 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549148-nrst9" Mar 08 05:48:00 crc kubenswrapper[4717]: I0308 05:48:00.142559 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:48:00 crc kubenswrapper[4717]: I0308 05:48:00.146239 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:48:00 crc kubenswrapper[4717]: I0308 05:48:00.146423 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:48:00 crc kubenswrapper[4717]: I0308 05:48:00.151894 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549148-nrst9"] Mar 08 05:48:00 crc kubenswrapper[4717]: I0308 05:48:00.289927 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px5m4\" (UniqueName: \"kubernetes.io/projected/90a9f655-96fd-4f95-bbd3-5bbf8db0faa5-kube-api-access-px5m4\") pod \"auto-csr-approver-29549148-nrst9\" (UID: \"90a9f655-96fd-4f95-bbd3-5bbf8db0faa5\") " pod="openshift-infra/auto-csr-approver-29549148-nrst9" Mar 08 05:48:00 crc kubenswrapper[4717]: I0308 05:48:00.392007 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px5m4\" (UniqueName: \"kubernetes.io/projected/90a9f655-96fd-4f95-bbd3-5bbf8db0faa5-kube-api-access-px5m4\") pod \"auto-csr-approver-29549148-nrst9\" (UID: \"90a9f655-96fd-4f95-bbd3-5bbf8db0faa5\") " pod="openshift-infra/auto-csr-approver-29549148-nrst9" Mar 08 05:48:00 crc kubenswrapper[4717]: I0308 05:48:00.419376 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px5m4\" (UniqueName: \"kubernetes.io/projected/90a9f655-96fd-4f95-bbd3-5bbf8db0faa5-kube-api-access-px5m4\") pod \"auto-csr-approver-29549148-nrst9\" (UID: \"90a9f655-96fd-4f95-bbd3-5bbf8db0faa5\") " 
pod="openshift-infra/auto-csr-approver-29549148-nrst9" Mar 08 05:48:00 crc kubenswrapper[4717]: I0308 05:48:00.487468 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549148-nrst9" Mar 08 05:48:02 crc kubenswrapper[4717]: I0308 05:48:02.585713 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Mar 08 05:48:02 crc kubenswrapper[4717]: I0308 05:48:02.586069 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:48:07 crc kubenswrapper[4717]: I0308 05:48:07.584999 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.003405 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.003474 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.003620 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-master-centos10/openstack-horizon:current,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5fbhcbh55dh554h569h5f9h684h5ddh58bh55dhc5h575h99hf8h579h5ch5bch5bdh88h5ch574h54ch69hb6h576hfdh544h5b5h568h74h684h577q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ghrb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-556b65bf97-x8664_openstack(1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 
05:48:08.009820 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-horizon:current\\\"\"]" pod="openstack/horizon-556b65bf97-x8664" podUID="1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.081350 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-drf74" event={"ID":"02c8cd6e-8f2d-4c20-9fbf-335c620a898e","Type":"ContainerDied","Data":"9b67928ca6357ef937bbfe60d7145550c4c15874fbf4662c0a5481460f26703d"} Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.081389 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b67928ca6357ef937bbfe60d7145550c4c15874fbf4662c0a5481460f26703d" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.085281 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ce22e317-1f45-443f-b495-0c9b297dc721","Type":"ContainerDied","Data":"082ff64934e71f4212e48d288296a7ce39dd2fc6b93019bd202f2a9da584a659"} Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.085339 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="082ff64934e71f4212e48d288296a7ce39dd2fc6b93019bd202f2a9da584a659" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.088407 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.088456 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.088608 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-master-centos10/openstack-horizon:current,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h5c7h679h67dhd5h6h695h5cch5fbh5b7hcbhc7h674h54chb8hdfhf6h5d8h99hf4h5b9h558hd7hb7h54ch675h55ch678h5b9h5ffh5f6h55bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdv6r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeD
evices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8fbc4b46c-9rg6p_openstack(a5d8149f-3efd-47b3-a228-bd47a3bf2073): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.091269 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-horizon:current\\\"\"]" pod="openstack/horizon-8fbc4b46c-9rg6p" podUID="a5d8149f-3efd-47b3-a228-bd47a3bf2073" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.139097 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.146818 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-drf74" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254452 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-combined-ca-bundle\") pod \"ce22e317-1f45-443f-b495-0c9b297dc721\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254584 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-scripts\") pod \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254665 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-logs\") pod \"ce22e317-1f45-443f-b495-0c9b297dc721\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254730 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-fernet-keys\") pod \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254752 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-httpd-run\") pod \"ce22e317-1f45-443f-b495-0c9b297dc721\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254785 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-config-data\") pod \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254863 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sth27\" (UniqueName: \"kubernetes.io/projected/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-kube-api-access-sth27\") pod \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254907 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-internal-tls-certs\") pod \"ce22e317-1f45-443f-b495-0c9b297dc721\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254931 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ce22e317-1f45-443f-b495-0c9b297dc721\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254964 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-scripts\") pod \"ce22e317-1f45-443f-b495-0c9b297dc721\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.254991 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-combined-ca-bundle\") pod \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.255028 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq9nm\" (UniqueName: \"kubernetes.io/projected/ce22e317-1f45-443f-b495-0c9b297dc721-kube-api-access-cq9nm\") pod \"ce22e317-1f45-443f-b495-0c9b297dc721\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.255062 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-config-data\") pod \"ce22e317-1f45-443f-b495-0c9b297dc721\" (UID: \"ce22e317-1f45-443f-b495-0c9b297dc721\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.255091 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-credential-keys\") pod \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\" (UID: \"02c8cd6e-8f2d-4c20-9fbf-335c620a898e\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.255398 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ce22e317-1f45-443f-b495-0c9b297dc721" (UID: "ce22e317-1f45-443f-b495-0c9b297dc721"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.255827 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.255832 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-logs" (OuterVolumeSpecName: "logs") pod "ce22e317-1f45-443f-b495-0c9b297dc721" (UID: "ce22e317-1f45-443f-b495-0c9b297dc721"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.261574 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-scripts" (OuterVolumeSpecName: "scripts") pod "02c8cd6e-8f2d-4c20-9fbf-335c620a898e" (UID: "02c8cd6e-8f2d-4c20-9fbf-335c620a898e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.261629 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-kube-api-access-sth27" (OuterVolumeSpecName: "kube-api-access-sth27") pod "02c8cd6e-8f2d-4c20-9fbf-335c620a898e" (UID: "02c8cd6e-8f2d-4c20-9fbf-335c620a898e"). InnerVolumeSpecName "kube-api-access-sth27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.261834 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "ce22e317-1f45-443f-b495-0c9b297dc721" (UID: "ce22e317-1f45-443f-b495-0c9b297dc721"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.262158 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "02c8cd6e-8f2d-4c20-9fbf-335c620a898e" (UID: "02c8cd6e-8f2d-4c20-9fbf-335c620a898e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.264388 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce22e317-1f45-443f-b495-0c9b297dc721-kube-api-access-cq9nm" (OuterVolumeSpecName: "kube-api-access-cq9nm") pod "ce22e317-1f45-443f-b495-0c9b297dc721" (UID: "ce22e317-1f45-443f-b495-0c9b297dc721"). InnerVolumeSpecName "kube-api-access-cq9nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.265622 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "02c8cd6e-8f2d-4c20-9fbf-335c620a898e" (UID: "02c8cd6e-8f2d-4c20-9fbf-335c620a898e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.272027 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-scripts" (OuterVolumeSpecName: "scripts") pod "ce22e317-1f45-443f-b495-0c9b297dc721" (UID: "ce22e317-1f45-443f-b495-0c9b297dc721"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.286278 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-config-data" (OuterVolumeSpecName: "config-data") pod "02c8cd6e-8f2d-4c20-9fbf-335c620a898e" (UID: "02c8cd6e-8f2d-4c20-9fbf-335c620a898e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.323221 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02c8cd6e-8f2d-4c20-9fbf-335c620a898e" (UID: "02c8cd6e-8f2d-4c20-9fbf-335c620a898e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.339189 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce22e317-1f45-443f-b495-0c9b297dc721" (UID: "ce22e317-1f45-443f-b495-0c9b297dc721"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.348145 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ce22e317-1f45-443f-b495-0c9b297dc721" (UID: "ce22e317-1f45-443f-b495-0c9b297dc721"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357577 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357605 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357617 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sth27\" (UniqueName: \"kubernetes.io/projected/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-kube-api-access-sth27\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357627 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357653 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357661 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357669 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357817 4717 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-cq9nm\" (UniqueName: \"kubernetes.io/projected/ce22e317-1f45-443f-b495-0c9b297dc721-kube-api-access-cq9nm\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357854 4717 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357864 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357884 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02c8cd6e-8f2d-4c20-9fbf-335c620a898e-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.357894 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce22e317-1f45-443f-b495-0c9b297dc721-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.368128 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-config-data" (OuterVolumeSpecName: "config-data") pod "ce22e317-1f45-443f-b495-0c9b297dc721" (UID: "ce22e317-1f45-443f-b495-0c9b297dc721"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.383389 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.463062 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.463107 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce22e317-1f45-443f-b495-0c9b297dc721-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.731946 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.732016 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.732162 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlqkp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-rlvrs_openstack(6c734bf7-1916-4a47-93e0-42caaaced812): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:48:08 crc kubenswrapper[4717]: E0308 05:48:08.733356 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-rlvrs" 
podUID="6c734bf7-1916-4a47-93e0-42caaaced812" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.783702 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-685d956dff-kcqnq" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.872220 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgk8g\" (UniqueName: \"kubernetes.io/projected/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-kube-api-access-hgk8g\") pod \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.872296 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-logs\") pod \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.872396 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-scripts\") pod \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.872416 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-config-data\") pod \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.872440 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-horizon-secret-key\") pod \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\" (UID: \"fc7fb4bf-d5ea-4ede-9d60-22786afec81d\") " Mar 08 05:48:08 
crc kubenswrapper[4717]: I0308 05:48:08.873129 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-logs" (OuterVolumeSpecName: "logs") pod "fc7fb4bf-d5ea-4ede-9d60-22786afec81d" (UID: "fc7fb4bf-d5ea-4ede-9d60-22786afec81d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.875346 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-scripts" (OuterVolumeSpecName: "scripts") pod "fc7fb4bf-d5ea-4ede-9d60-22786afec81d" (UID: "fc7fb4bf-d5ea-4ede-9d60-22786afec81d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.875595 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-config-data" (OuterVolumeSpecName: "config-data") pod "fc7fb4bf-d5ea-4ede-9d60-22786afec81d" (UID: "fc7fb4bf-d5ea-4ede-9d60-22786afec81d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.876879 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fc7fb4bf-d5ea-4ede-9d60-22786afec81d" (UID: "fc7fb4bf-d5ea-4ede-9d60-22786afec81d"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.878642 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-kube-api-access-hgk8g" (OuterVolumeSpecName: "kube-api-access-hgk8g") pod "fc7fb4bf-d5ea-4ede-9d60-22786afec81d" (UID: "fc7fb4bf-d5ea-4ede-9d60-22786afec81d"). InnerVolumeSpecName "kube-api-access-hgk8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.975092 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.975139 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgk8g\" (UniqueName: \"kubernetes.io/projected/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-kube-api-access-hgk8g\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.975161 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.975180 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:08 crc kubenswrapper[4717]: I0308 05:48:08.975197 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc7fb4bf-d5ea-4ede-9d60-22786afec81d-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.097538 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-drf74" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.097592 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-685d956dff-kcqnq" event={"ID":"fc7fb4bf-d5ea-4ede-9d60-22786afec81d","Type":"ContainerDied","Data":"55088ea055960e978e4c6e21e0da66a3fc4d84d642155686a4387e374ed0d75b"} Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.097676 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-685d956dff-kcqnq" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.097931 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: E0308 05:48:09.100970 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current\\\"\"" pod="openstack/barbican-db-sync-rlvrs" podUID="6c734bf7-1916-4a47-93e0-42caaaced812" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.174876 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.199129 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.222906 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:48:09 crc kubenswrapper[4717]: E0308 05:48:09.223314 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce22e317-1f45-443f-b495-0c9b297dc721" containerName="glance-log" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.223330 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ce22e317-1f45-443f-b495-0c9b297dc721" containerName="glance-log" Mar 08 05:48:09 crc kubenswrapper[4717]: E0308 05:48:09.223340 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce22e317-1f45-443f-b495-0c9b297dc721" containerName="glance-httpd" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.223346 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce22e317-1f45-443f-b495-0c9b297dc721" containerName="glance-httpd" Mar 08 05:48:09 crc kubenswrapper[4717]: E0308 05:48:09.223361 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c8cd6e-8f2d-4c20-9fbf-335c620a898e" containerName="keystone-bootstrap" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.223367 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c8cd6e-8f2d-4c20-9fbf-335c620a898e" containerName="keystone-bootstrap" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.223966 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce22e317-1f45-443f-b495-0c9b297dc721" containerName="glance-httpd" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.223990 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c8cd6e-8f2d-4c20-9fbf-335c620a898e" containerName="keystone-bootstrap" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.224006 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce22e317-1f45-443f-b495-0c9b297dc721" containerName="glance-log" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.225055 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.227279 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.227495 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.252017 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-685d956dff-kcqnq"] Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.270881 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-685d956dff-kcqnq"] Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.280980 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.281079 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52s48\" (UniqueName: \"kubernetes.io/projected/23b12879-34d7-47df-a056-6caccf7dec10-kube-api-access-52s48\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.281105 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " 
pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.281133 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-scripts\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.281150 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-logs\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.281194 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.281219 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-config-data\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.281235 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " 
pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.284943 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.324661 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-drf74"] Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.331419 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-drf74"] Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.379663 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-j9lc2"] Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.383305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52s48\" (UniqueName: \"kubernetes.io/projected/23b12879-34d7-47df-a056-6caccf7dec10-kube-api-access-52s48\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.383354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.383367 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.383381 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-logs\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.383398 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-scripts\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.383435 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.383460 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-config-data\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.383477 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 
05:48:09.383517 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.388279 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-scripts\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.388828 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.388942 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.389249 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z9bhb" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.389391 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-logs\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.389488 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 05:48:09 crc 
kubenswrapper[4717]: I0308 05:48:09.390058 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.390230 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.391276 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.392053 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j9lc2"] Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.394040 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.410107 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-config-data\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.421874 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52s48\" (UniqueName: \"kubernetes.io/projected/23b12879-34d7-47df-a056-6caccf7dec10-kube-api-access-52s48\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 
05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.423881 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.430097 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.485027 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-config-data\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.485146 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-combined-ca-bundle\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.485390 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljnb8\" (UniqueName: \"kubernetes.io/projected/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-kube-api-access-ljnb8\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 
05:48:09.485427 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-fernet-keys\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.485543 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-credential-keys\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.485631 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-scripts\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.580400 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.587832 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljnb8\" (UniqueName: \"kubernetes.io/projected/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-kube-api-access-ljnb8\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.587884 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-fernet-keys\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.587921 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-credential-keys\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.587987 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-scripts\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.588033 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-config-data\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: 
I0308 05:48:09.588080 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-combined-ca-bundle\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.593544 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-credential-keys\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.593655 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-scripts\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.594129 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-config-data\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.598241 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-fernet-keys\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.598666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-combined-ca-bundle\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.609864 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljnb8\" (UniqueName: \"kubernetes.io/projected/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-kube-api-access-ljnb8\") pod \"keystone-bootstrap-j9lc2\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.780344 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.798653 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02c8cd6e-8f2d-4c20-9fbf-335c620a898e" path="/var/lib/kubelet/pods/02c8cd6e-8f2d-4c20-9fbf-335c620a898e/volumes" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.799448 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce22e317-1f45-443f-b495-0c9b297dc721" path="/var/lib/kubelet/pods/ce22e317-1f45-443f-b495-0c9b297dc721/volumes" Mar 08 05:48:09 crc kubenswrapper[4717]: I0308 05:48:09.800107 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc7fb4bf-d5ea-4ede-9d60-22786afec81d" path="/var/lib/kubelet/pods/fc7fb4bf-d5ea-4ede-9d60-22786afec81d/volumes" Mar 08 05:48:10 crc kubenswrapper[4717]: E0308 05:48:10.219278 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current" Mar 08 05:48:10 crc kubenswrapper[4717]: E0308 05:48:10.219834 4717 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current" Mar 08 05:48:10 crc kubenswrapper[4717]: E0308 05:48:10.220041 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwzn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},
LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-66q5h_openstack(ec6c6686-44c7-49ec-950b-7054d96e207d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 05:48:10 crc kubenswrapper[4717]: E0308 05:48:10.252259 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-66q5h" podUID="ec6c6686-44c7-49ec-950b-7054d96e207d" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.699974 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.735092 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.756448 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824128 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5d8149f-3efd-47b3-a228-bd47a3bf2073-logs\") pod \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824185 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdv6r\" (UniqueName: \"kubernetes.io/projected/a5d8149f-3efd-47b3-a228-bd47a3bf2073-kube-api-access-fdv6r\") pod \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824220 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5d8149f-3efd-47b3-a228-bd47a3bf2073-horizon-secret-key\") pod \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824240 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-logs\") pod \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824258 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-config-data\") pod \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824317 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-swift-storage-0\") pod \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824339 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-scripts\") pod \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824359 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-nb\") pod \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824379 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-config-data\") pod \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824413 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-svc\") pod \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824451 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8zvw\" (UniqueName: \"kubernetes.io/projected/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-kube-api-access-g8zvw\") pod \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824466 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghrb5\" (UniqueName: \"kubernetes.io/projected/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-kube-api-access-ghrb5\") pod \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824481 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-scripts\") pod \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\" (UID: \"a5d8149f-3efd-47b3-a228-bd47a3bf2073\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824498 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-horizon-secret-key\") pod \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\" (UID: \"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824516 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-config\") pod \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.824556 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-sb\") pod \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\" (UID: \"9b2a46ff-0442-4a0e-b8da-7c24b20df3da\") " Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.827024 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-logs" (OuterVolumeSpecName: "logs") pod "1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0" (UID: 
"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.827243 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5d8149f-3efd-47b3-a228-bd47a3bf2073-logs" (OuterVolumeSpecName: "logs") pod "a5d8149f-3efd-47b3-a228-bd47a3bf2073" (UID: "a5d8149f-3efd-47b3-a228-bd47a3bf2073"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.827837 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-scripts" (OuterVolumeSpecName: "scripts") pod "a5d8149f-3efd-47b3-a228-bd47a3bf2073" (UID: "a5d8149f-3efd-47b3-a228-bd47a3bf2073"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.828200 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-config-data" (OuterVolumeSpecName: "config-data") pod "a5d8149f-3efd-47b3-a228-bd47a3bf2073" (UID: "a5d8149f-3efd-47b3-a228-bd47a3bf2073"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.828307 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-scripts" (OuterVolumeSpecName: "scripts") pod "1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0" (UID: "1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.828432 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-config-data" (OuterVolumeSpecName: "config-data") pod "1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0" (UID: "1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.832162 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-kube-api-access-g8zvw" (OuterVolumeSpecName: "kube-api-access-g8zvw") pod "9b2a46ff-0442-4a0e-b8da-7c24b20df3da" (UID: "9b2a46ff-0442-4a0e-b8da-7c24b20df3da"). InnerVolumeSpecName "kube-api-access-g8zvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.832355 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d8149f-3efd-47b3-a228-bd47a3bf2073-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a5d8149f-3efd-47b3-a228-bd47a3bf2073" (UID: "a5d8149f-3efd-47b3-a228-bd47a3bf2073"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.833330 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0" (UID: "1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.834356 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5d8149f-3efd-47b3-a228-bd47a3bf2073-kube-api-access-fdv6r" (OuterVolumeSpecName: "kube-api-access-fdv6r") pod "a5d8149f-3efd-47b3-a228-bd47a3bf2073" (UID: "a5d8149f-3efd-47b3-a228-bd47a3bf2073"). InnerVolumeSpecName "kube-api-access-fdv6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.838020 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-kube-api-access-ghrb5" (OuterVolumeSpecName: "kube-api-access-ghrb5") pod "1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0" (UID: "1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0"). InnerVolumeSpecName "kube-api-access-ghrb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.884087 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b2a46ff-0442-4a0e-b8da-7c24b20df3da" (UID: "9b2a46ff-0442-4a0e-b8da-7c24b20df3da"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.889261 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9b2a46ff-0442-4a0e-b8da-7c24b20df3da" (UID: "9b2a46ff-0442-4a0e-b8da-7c24b20df3da"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.909131 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b2a46ff-0442-4a0e-b8da-7c24b20df3da" (UID: "9b2a46ff-0442-4a0e-b8da-7c24b20df3da"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.915222 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-config" (OuterVolumeSpecName: "config") pod "9b2a46ff-0442-4a0e-b8da-7c24b20df3da" (UID: "9b2a46ff-0442-4a0e-b8da-7c24b20df3da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928294 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928623 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928637 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928646 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc 
kubenswrapper[4717]: I0308 05:48:10.928655 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8zvw\" (UniqueName: \"kubernetes.io/projected/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-kube-api-access-g8zvw\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928664 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghrb5\" (UniqueName: \"kubernetes.io/projected/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-kube-api-access-ghrb5\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928673 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928681 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928704 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928711 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928719 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5d8149f-3efd-47b3-a228-bd47a3bf2073-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928729 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdv6r\" (UniqueName: 
\"kubernetes.io/projected/a5d8149f-3efd-47b3-a228-bd47a3bf2073-kube-api-access-fdv6r\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928850 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5d8149f-3efd-47b3-a228-bd47a3bf2073-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928864 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.928873 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5d8149f-3efd-47b3-a228-bd47a3bf2073-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:10 crc kubenswrapper[4717]: I0308 05:48:10.940185 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b2a46ff-0442-4a0e-b8da-7c24b20df3da" (UID: "9b2a46ff-0442-4a0e-b8da-7c24b20df3da"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.031958 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b2a46ff-0442-4a0e-b8da-7c24b20df3da-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.126773 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wvxkm" event={"ID":"ffc84338-48ac-4538-b134-5993d5a9f91c","Type":"ContainerStarted","Data":"8f8d69e9d6cc8b05c2041743600d3eaa85202c84e62d77a0f17afb206daa3902"} Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.129196 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fba18a-af8c-449a-be74-e2ad6438afa0","Type":"ContainerStarted","Data":"56d6adbe1a47bc739f60662d8c5fcff494f54d3c7e7935ce9d54cdbdb0582939"} Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.130893 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8132755-0c53-4feb-80d9-1e86d52d0ea8","Type":"ContainerStarted","Data":"9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836"} Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.133100 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" event={"ID":"9b2a46ff-0442-4a0e-b8da-7c24b20df3da","Type":"ContainerDied","Data":"65178bab378d5202ac4cda5b1fd2d15cd835d51debc976efc60c7782189590aa"} Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.133146 4717 scope.go:117] "RemoveContainer" containerID="2a74ed5fd2bc4bf0d6a0c5137e41e1810d7da7df39de8f7139ead1105a730543" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.133247 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf4dd7b85-w8lmr" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.134616 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8fbc4b46c-9rg6p" event={"ID":"a5d8149f-3efd-47b3-a228-bd47a3bf2073","Type":"ContainerDied","Data":"b883748a24317a4a8e9ac7dec3308982ad042b52c4a9ac24f37830a0627d58d6"} Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.134735 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8fbc4b46c-9rg6p" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.147415 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-556b65bf97-x8664" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.148061 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-556b65bf97-x8664" event={"ID":"1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0","Type":"ContainerDied","Data":"e98042916b9c3156f8aacd514be6b9ab929d2ccd043b2f4303d6bc42a46f400d"} Mar 08 05:48:11 crc kubenswrapper[4717]: E0308 05:48:11.152086 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current\\\"\"" pod="openstack/cinder-db-sync-66q5h" podUID="ec6c6686-44c7-49ec-950b-7054d96e207d" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.152896 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.153637 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-wvxkm" podStartSLOduration=3.324284623 podStartE2EDuration="31.153620638s" podCreationTimestamp="2026-03-08 05:47:40 +0000 UTC" firstStartedPulling="2026-03-08 05:47:42.329717493 +0000 UTC m=+1289.247366337" 
lastFinishedPulling="2026-03-08 05:48:10.159053498 +0000 UTC m=+1317.076702352" observedRunningTime="2026-03-08 05:48:11.141252614 +0000 UTC m=+1318.058901458" watchObservedRunningTime="2026-03-08 05:48:11.153620638 +0000 UTC m=+1318.071269482" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.197222 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.205373 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.218292 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cf759c7cb-qxb65"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.223575 4717 scope.go:117] "RemoveContainer" containerID="868d3357f9dd8d9db5a03de86cdd4c286b5649a56a4ff0d86e23bc3082e4bec7" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.268510 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69bcb664dd-nb94m"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.299762 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549148-nrst9"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.300721 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j9lc2"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.348399 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.453435 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf4dd7b85-w8lmr"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.461015 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf4dd7b85-w8lmr"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.513122 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-556b65bf97-x8664"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.523496 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-556b65bf97-x8664"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.546986 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8fbc4b46c-9rg6p"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.553597 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8fbc4b46c-9rg6p"] Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.793895 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0" path="/var/lib/kubelet/pods/1ed8c343-c8a5-4b4c-8766-dfb98c78d0e0/volumes" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.794821 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" path="/var/lib/kubelet/pods/9b2a46ff-0442-4a0e-b8da-7c24b20df3da/volumes" Mar 08 05:48:11 crc kubenswrapper[4717]: I0308 05:48:11.795863 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5d8149f-3efd-47b3-a228-bd47a3bf2073" path="/var/lib/kubelet/pods/a5d8149f-3efd-47b3-a228-bd47a3bf2073/volumes" Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.159882 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"6be711ba-e0dc-4d84-a9d3-910819cc02e3","Type":"ContainerStarted","Data":"52cee56da04edbfd0799ffef5e0f21c28e88a1592e4cd394ea7ffbccc5ea89a2"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.162337 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"62b1d992-0078-406b-ade7-6710e9a62c96","Type":"ContainerStarted","Data":"1d9b97bcd77096f12879669bbd083a51cf198a9747f27cd3730189e502557834"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.162403 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/watcher-api-0" event={"ID":"62b1d992-0078-406b-ade7-6710e9a62c96","Type":"ContainerStarted","Data":"4f45f9c20ff0b17f2f862b411f18740f87fa5ba961d64c443a5261047bc799be"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.162789 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.164288 4717 generic.go:334] "Generic (PLEG): container finished" podID="818627ad-f6eb-43d2-adfc-7daacc7f9b6f" containerID="ed51d405b2bc45be9abcfa817172cb6ff7c1fd2dcc3a0ada30ccd6707e23e39b" exitCode=0 Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.164393 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-htmd2" event={"ID":"818627ad-f6eb-43d2-adfc-7daacc7f9b6f","Type":"ContainerDied","Data":"ed51d405b2bc45be9abcfa817172cb6ff7c1fd2dcc3a0ada30ccd6707e23e39b"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.168090 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j9lc2" event={"ID":"0ac4bbfc-288a-451e-8f03-864b4b2cb96e","Type":"ContainerStarted","Data":"207480c3692c1d3bfed26d5c8ae0a4b4e7e835919d7acf36b2e52514805ce2eb"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.168149 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j9lc2" event={"ID":"0ac4bbfc-288a-451e-8f03-864b4b2cb96e","Type":"ContainerStarted","Data":"7dbe4bde86b93074065ca1934198a3249ed224557630cd0d641417f9ab6547ca"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.170184 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549148-nrst9" event={"ID":"90a9f655-96fd-4f95-bbd3-5bbf8db0faa5","Type":"ContainerStarted","Data":"ce0508c056b05a738b55eac5ccf6d276ac47ad31548ef6246b87b57398a34552"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.171819 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-69bcb664dd-nb94m" event={"ID":"9ab815c4-1b4d-499a-af69-f5e5907c9542","Type":"ContainerStarted","Data":"d2e9c0c6898285c45d194cb969b745aa09e117097adefb8379ddaa86d091308e"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.171849 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69bcb664dd-nb94m" event={"ID":"9ab815c4-1b4d-499a-af69-f5e5907c9542","Type":"ContainerStarted","Data":"a854f6fe4d692a6a0dd493c9e86c8c97e28ede9207b69ee3006c5ed7871573d1"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.172811 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cf759c7cb-qxb65" event={"ID":"a68b99b9-3abd-4e46-b116-c740daf70c8f","Type":"ContainerStarted","Data":"3b812ea7dff38aec51591b7b4b9d2de2ad9d1d7c5cb7c572f9bf8c1b89923f4f"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.174281 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"23b12879-34d7-47df-a056-6caccf7dec10","Type":"ContainerStarted","Data":"c1917f2759c86ee9721919a30a060871204b566f474f611a312fb765bec38dd4"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.174338 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"23b12879-34d7-47df-a056-6caccf7dec10","Type":"ContainerStarted","Data":"290261224c0ce51d3ca223683b9a0c4fd3f00efb6e784f88a4489b513857d02c"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.176749 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a","Type":"ContainerStarted","Data":"b36ebddd35b1cbdbd3caf2444fbe06303ff15540d635bab9faab19b96f3aa94f"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.181847 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api" 
probeResult="failure" output="Get \"http://10.217.0.169:9322/\": dial tcp 10.217.0.169:9322: connect: connection refused" Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.182992 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=24.182974714 podStartE2EDuration="24.182974714s" podCreationTimestamp="2026-03-08 05:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:12.179488798 +0000 UTC m=+1319.097137652" watchObservedRunningTime="2026-03-08 05:48:12.182974714 +0000 UTC m=+1319.100623558" Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.185068 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" containerName="glance-log" containerID="cri-o://9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836" gracePeriod=30 Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.185189 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8132755-0c53-4feb-80d9-1e86d52d0ea8","Type":"ContainerStarted","Data":"c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d"} Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.185267 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" containerName="glance-httpd" containerID="cri-o://c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d" gracePeriod=30 Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.219503 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-j9lc2" podStartSLOduration=3.219488413 podStartE2EDuration="3.219488413s" podCreationTimestamp="2026-03-08 
05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:12.21491752 +0000 UTC m=+1319.132566364" watchObservedRunningTime="2026-03-08 05:48:12.219488413 +0000 UTC m=+1319.137137257" Mar 08 05:48:12 crc kubenswrapper[4717]: I0308 05:48:12.244423 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=26.244405676 podStartE2EDuration="26.244405676s" podCreationTimestamp="2026-03-08 05:47:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:12.237971548 +0000 UTC m=+1319.155620422" watchObservedRunningTime="2026-03-08 05:48:12.244405676 +0000 UTC m=+1319.162054520" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.055368 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.182551 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-httpd-run\") pod \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.182622 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-config-data\") pod \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.182886 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"f8132755-0c53-4feb-80d9-1e86d52d0ea8\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.182948 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-public-tls-certs\") pod \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.183334 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f8132755-0c53-4feb-80d9-1e86d52d0ea8" (UID: "f8132755-0c53-4feb-80d9-1e86d52d0ea8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.183477 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n45h\" (UniqueName: \"kubernetes.io/projected/f8132755-0c53-4feb-80d9-1e86d52d0ea8-kube-api-access-7n45h\") pod \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.183570 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-scripts\") pod \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.183602 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-combined-ca-bundle\") pod \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.183642 
4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-logs\") pod \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\" (UID: \"f8132755-0c53-4feb-80d9-1e86d52d0ea8\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.184194 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.184333 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-logs" (OuterVolumeSpecName: "logs") pod "f8132755-0c53-4feb-80d9-1e86d52d0ea8" (UID: "f8132755-0c53-4feb-80d9-1e86d52d0ea8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.192741 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8132755-0c53-4feb-80d9-1e86d52d0ea8-kube-api-access-7n45h" (OuterVolumeSpecName: "kube-api-access-7n45h") pod "f8132755-0c53-4feb-80d9-1e86d52d0ea8" (UID: "f8132755-0c53-4feb-80d9-1e86d52d0ea8"). InnerVolumeSpecName "kube-api-access-7n45h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.195234 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "f8132755-0c53-4feb-80d9-1e86d52d0ea8" (UID: "f8132755-0c53-4feb-80d9-1e86d52d0ea8"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.212103 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-scripts" (OuterVolumeSpecName: "scripts") pod "f8132755-0c53-4feb-80d9-1e86d52d0ea8" (UID: "f8132755-0c53-4feb-80d9-1e86d52d0ea8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.223113 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fba18a-af8c-449a-be74-e2ad6438afa0","Type":"ContainerStarted","Data":"20143731b1089e4e5df5f39c5474d218cc338bc7f42a3ac3401f869d4dfd0375"} Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.236658 4717 generic.go:334] "Generic (PLEG): container finished" podID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" containerID="c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d" exitCode=0 Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.236729 4717 generic.go:334] "Generic (PLEG): container finished" podID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" containerID="9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836" exitCode=143 Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.236781 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8132755-0c53-4feb-80d9-1e86d52d0ea8","Type":"ContainerDied","Data":"c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d"} Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.236804 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8132755-0c53-4feb-80d9-1e86d52d0ea8","Type":"ContainerDied","Data":"9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836"} Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.236814 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f8132755-0c53-4feb-80d9-1e86d52d0ea8","Type":"ContainerDied","Data":"e3e985bcbceae0c9e92b492d0723d87e9693ee7de0b8b15c3a5665f2ad8db155"} Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.236829 4717 scope.go:117] "RemoveContainer" containerID="c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.236861 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.242355 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"62b1d992-0078-406b-ade7-6710e9a62c96","Type":"ContainerStarted","Data":"0d73df4708da23d730f3975f3149f1ab5449833fb5c2db88e445f08159ce201d"} Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.245292 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cf759c7cb-qxb65" event={"ID":"a68b99b9-3abd-4e46-b116-c740daf70c8f","Type":"ContainerStarted","Data":"35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200"} Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.247418 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8132755-0c53-4feb-80d9-1e86d52d0ea8" (UID: "f8132755-0c53-4feb-80d9-1e86d52d0ea8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.278996 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-config-data" (OuterVolumeSpecName: "config-data") pod "f8132755-0c53-4feb-80d9-1e86d52d0ea8" (UID: "f8132755-0c53-4feb-80d9-1e86d52d0ea8"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.284727 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f8132755-0c53-4feb-80d9-1e86d52d0ea8" (UID: "f8132755-0c53-4feb-80d9-1e86d52d0ea8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.285806 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.285836 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n45h\" (UniqueName: \"kubernetes.io/projected/f8132755-0c53-4feb-80d9-1e86d52d0ea8-kube-api-access-7n45h\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.285850 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.285863 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.285873 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8132755-0c53-4feb-80d9-1e86d52d0ea8-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.285883 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f8132755-0c53-4feb-80d9-1e86d52d0ea8-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.285909 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.325284 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.387462 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.462300 4717 scope.go:117] "RemoveContainer" containerID="9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.502417 4717 scope.go:117] "RemoveContainer" containerID="c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d" Mar 08 05:48:13 crc kubenswrapper[4717]: E0308 05:48:13.508066 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d\": container with ID starting with c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d not found: ID does not exist" containerID="c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.508112 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d"} err="failed to get container status 
\"c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d\": rpc error: code = NotFound desc = could not find container \"c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d\": container with ID starting with c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d not found: ID does not exist" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.508155 4717 scope.go:117] "RemoveContainer" containerID="9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836" Mar 08 05:48:13 crc kubenswrapper[4717]: E0308 05:48:13.508748 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836\": container with ID starting with 9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836 not found: ID does not exist" containerID="9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.508794 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836"} err="failed to get container status \"9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836\": rpc error: code = NotFound desc = could not find container \"9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836\": container with ID starting with 9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836 not found: ID does not exist" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.508822 4717 scope.go:117] "RemoveContainer" containerID="c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.509247 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d"} err="failed to get 
container status \"c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d\": rpc error: code = NotFound desc = could not find container \"c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d\": container with ID starting with c4b17339d808b23324a000cb09543d91a42967faf504ca50189dce06bb58be3d not found: ID does not exist" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.509284 4717 scope.go:117] "RemoveContainer" containerID="9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.509651 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836"} err="failed to get container status \"9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836\": rpc error: code = NotFound desc = could not find container \"9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836\": container with ID starting with 9463ed022fbacf921ccf9b7d51ffb5c6f50d0acecbee471940b9449b3e023836 not found: ID does not exist" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.607873 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.636666 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-htmd2" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.650153 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.657082 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.689832 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:48:13 crc kubenswrapper[4717]: E0308 05:48:13.690368 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" containerName="glance-httpd" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.690388 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" containerName="glance-httpd" Mar 08 05:48:13 crc kubenswrapper[4717]: E0308 05:48:13.690423 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerName="dnsmasq-dns" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.690432 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerName="dnsmasq-dns" Mar 08 05:48:13 crc kubenswrapper[4717]: E0308 05:48:13.690445 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerName="init" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.690451 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerName="init" Mar 08 05:48:13 crc kubenswrapper[4717]: E0308 05:48:13.690460 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818627ad-f6eb-43d2-adfc-7daacc7f9b6f" containerName="neutron-db-sync" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.690466 4717 
state_mem.go:107] "Deleted CPUSet assignment" podUID="818627ad-f6eb-43d2-adfc-7daacc7f9b6f" containerName="neutron-db-sync" Mar 08 05:48:13 crc kubenswrapper[4717]: E0308 05:48:13.690500 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" containerName="glance-log" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.690507 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" containerName="glance-log" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.690729 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b2a46ff-0442-4a0e-b8da-7c24b20df3da" containerName="dnsmasq-dns" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.690747 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" containerName="glance-log" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.690774 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="818627ad-f6eb-43d2-adfc-7daacc7f9b6f" containerName="neutron-db-sync" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.690784 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" containerName="glance-httpd" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.692143 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.695985 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.696163 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.697937 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.793296 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-combined-ca-bundle\") pod \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.794349 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w24m\" (UniqueName: \"kubernetes.io/projected/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-kube-api-access-8w24m\") pod \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.794414 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-config\") pod \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\" (UID: \"818627ad-f6eb-43d2-adfc-7daacc7f9b6f\") " Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.795787 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-scripts\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " 
pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.796624 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-config-data\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.796698 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.796758 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.796884 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjc26\" (UniqueName: \"kubernetes.io/projected/6345fc76-e42d-4a13-90d2-c2bd5135f073-kube-api-access-fjc26\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.797051 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.797122 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.797374 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-logs\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.801146 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-kube-api-access-8w24m" (OuterVolumeSpecName: "kube-api-access-8w24m") pod "818627ad-f6eb-43d2-adfc-7daacc7f9b6f" (UID: "818627ad-f6eb-43d2-adfc-7daacc7f9b6f"). InnerVolumeSpecName "kube-api-access-8w24m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.811847 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8132755-0c53-4feb-80d9-1e86d52d0ea8" path="/var/lib/kubelet/pods/f8132755-0c53-4feb-80d9-1e86d52d0ea8/volumes" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.864573 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "818627ad-f6eb-43d2-adfc-7daacc7f9b6f" (UID: "818627ad-f6eb-43d2-adfc-7daacc7f9b6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.873882 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-config" (OuterVolumeSpecName: "config") pod "818627ad-f6eb-43d2-adfc-7daacc7f9b6f" (UID: "818627ad-f6eb-43d2-adfc-7daacc7f9b6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.901849 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-config-data\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.901894 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.901946 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.901999 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjc26\" (UniqueName: \"kubernetes.io/projected/6345fc76-e42d-4a13-90d2-c2bd5135f073-kube-api-access-fjc26\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " 
pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.902033 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.902053 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.902106 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-logs\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.902144 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-scripts\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.902188 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.902199 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w24m\" (UniqueName: 
\"kubernetes.io/projected/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-kube-api-access-8w24m\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.902209 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/818627ad-f6eb-43d2-adfc-7daacc7f9b6f-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.904358 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.904991 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-logs\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.905108 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.905178 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-scripts\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.905513 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") device mount path 
\"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.906937 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.916259 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.928401 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-config-data\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.936382 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjc26\" (UniqueName: \"kubernetes.io/projected/6345fc76-e42d-4a13-90d2-c2bd5135f073-kube-api-access-fjc26\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.940412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:13 crc kubenswrapper[4717]: I0308 05:48:13.982712 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " pod="openstack/glance-default-external-api-0" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.012600 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.262718 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a","Type":"ContainerStarted","Data":"425e9f3a695ff87478add2c75e6830e7d77823e6940c41a52102c0847b13e48b"} Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.291319 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549148-nrst9" event={"ID":"90a9f655-96fd-4f95-bbd3-5bbf8db0faa5","Type":"ContainerStarted","Data":"14cd17fcca283abbc714961f8eb0a5659f169538f6fdbff2f48cbae56f3280d9"} Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.292249 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=24.373758902 podStartE2EDuration="26.292223461s" podCreationTimestamp="2026-03-08 05:47:48 +0000 UTC" firstStartedPulling="2026-03-08 05:48:11.190981038 +0000 UTC m=+1318.108629882" lastFinishedPulling="2026-03-08 05:48:13.109445577 +0000 UTC m=+1320.027094441" observedRunningTime="2026-03-08 05:48:14.275289194 +0000 UTC m=+1321.192938028" watchObservedRunningTime="2026-03-08 05:48:14.292223461 +0000 UTC m=+1321.209872305" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.316319 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69bcb664dd-nb94m" event={"ID":"9ab815c4-1b4d-499a-af69-f5e5907c9542","Type":"ContainerStarted","Data":"2704df8d04ab61f423d33732625f60e22828174e0cc6615608183e66ac0ebc36"} Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 
05:48:14.328442 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549148-nrst9" podStartSLOduration=12.484934661 podStartE2EDuration="14.328423523s" podCreationTimestamp="2026-03-08 05:48:00 +0000 UTC" firstStartedPulling="2026-03-08 05:48:11.246647749 +0000 UTC m=+1318.164296593" lastFinishedPulling="2026-03-08 05:48:13.090136611 +0000 UTC m=+1320.007785455" observedRunningTime="2026-03-08 05:48:14.319869472 +0000 UTC m=+1321.237518326" watchObservedRunningTime="2026-03-08 05:48:14.328423523 +0000 UTC m=+1321.246072357" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.330059 4717 generic.go:334] "Generic (PLEG): container finished" podID="ffc84338-48ac-4538-b134-5993d5a9f91c" containerID="8f8d69e9d6cc8b05c2041743600d3eaa85202c84e62d77a0f17afb206daa3902" exitCode=0 Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.330160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wvxkm" event={"ID":"ffc84338-48ac-4538-b134-5993d5a9f91c","Type":"ContainerDied","Data":"8f8d69e9d6cc8b05c2041743600d3eaa85202c84e62d77a0f17afb206daa3902"} Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.341858 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-69bcb664dd-nb94m" podStartSLOduration=25.122509242 podStartE2EDuration="25.341842703s" podCreationTimestamp="2026-03-08 05:47:49 +0000 UTC" firstStartedPulling="2026-03-08 05:48:11.195823437 +0000 UTC m=+1318.113472281" lastFinishedPulling="2026-03-08 05:48:11.415156898 +0000 UTC m=+1318.332805742" observedRunningTime="2026-03-08 05:48:14.337045455 +0000 UTC m=+1321.254694299" watchObservedRunningTime="2026-03-08 05:48:14.341842703 +0000 UTC m=+1321.259491547" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.355944 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"23b12879-34d7-47df-a056-6caccf7dec10","Type":"ContainerStarted","Data":"7f16a619842332821727a9d6674b333242122ffe323c117935455e4b11b303a9"} Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.386049 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"6be711ba-e0dc-4d84-a9d3-910819cc02e3","Type":"ContainerStarted","Data":"15e4e2f7f032554a9a7fb21fdd9fd66e43fba3ddb3d00d7f1611fe2fd4558691"} Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.424271 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-htmd2" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.424262 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-htmd2" event={"ID":"818627ad-f6eb-43d2-adfc-7daacc7f9b6f","Type":"ContainerDied","Data":"ad179de5658abc4d3bc7e1dd85d8b9d92ac86e7c830aa870559efc402947430b"} Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.444729 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad179de5658abc4d3bc7e1dd85d8b9d92ac86e7c830aa870559efc402947430b" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.433576 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.433553981 podStartE2EDuration="5.433553981s" podCreationTimestamp="2026-03-08 05:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:14.391326752 +0000 UTC m=+1321.308975586" watchObservedRunningTime="2026-03-08 05:48:14.433553981 +0000 UTC m=+1321.351202815" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.457448 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cf759c7cb-qxb65" 
event={"ID":"a68b99b9-3abd-4e46-b116-c740daf70c8f","Type":"ContainerStarted","Data":"a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67"} Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.510176 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=24.624204989 podStartE2EDuration="26.510154797s" podCreationTimestamp="2026-03-08 05:47:48 +0000 UTC" firstStartedPulling="2026-03-08 05:48:11.210788616 +0000 UTC m=+1318.128437460" lastFinishedPulling="2026-03-08 05:48:13.096738384 +0000 UTC m=+1320.014387268" observedRunningTime="2026-03-08 05:48:14.420987142 +0000 UTC m=+1321.338635986" watchObservedRunningTime="2026-03-08 05:48:14.510154797 +0000 UTC m=+1321.427803641" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.534769 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56bd65676f-5n794"] Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.536551 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.547856 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56bd65676f-5n794"] Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.551879 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7cf759c7cb-qxb65" podStartSLOduration=24.880075533 podStartE2EDuration="25.551864774s" podCreationTimestamp="2026-03-08 05:47:49 +0000 UTC" firstStartedPulling="2026-03-08 05:48:11.223416677 +0000 UTC m=+1318.141065521" lastFinishedPulling="2026-03-08 05:48:11.895205918 +0000 UTC m=+1318.812854762" observedRunningTime="2026-03-08 05:48:14.505668147 +0000 UTC m=+1321.423316991" watchObservedRunningTime="2026-03-08 05:48:14.551864774 +0000 UTC m=+1321.469513618" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.634906 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-config\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.635172 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-svc\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.635250 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fskv\" (UniqueName: \"kubernetes.io/projected/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-kube-api-access-6fskv\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: 
\"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.635341 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-nb\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.635419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-sb\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.654040 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-swift-storage-0\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.684910 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6fd6c6f4-sgbls"] Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.686522 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.695847 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.696000 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.696145 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.696262 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xh2tz" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.700725 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fd6c6f4-sgbls"] Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756468 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-httpd-config\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756525 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-config\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756548 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-svc\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " 
pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756566 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fskv\" (UniqueName: \"kubernetes.io/projected/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-kube-api-access-6fskv\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756589 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-nb\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756609 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-sb\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756636 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zjwp\" (UniqueName: \"kubernetes.io/projected/f9407de8-78ab-4bf1-9f53-49e71656898a-kube-api-access-6zjwp\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-swift-storage-0\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " 
pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756722 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-ovndb-tls-certs\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756742 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-combined-ca-bundle\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.756792 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-config\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.757508 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.757564 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-config\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.758885 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-swift-storage-0\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.758968 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-sb\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.759283 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-svc\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.759437 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-nb\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.788068 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fskv\" (UniqueName: \"kubernetes.io/projected/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-kube-api-access-6fskv\") pod \"dnsmasq-dns-56bd65676f-5n794\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.870268 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-config\") pod 
\"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.871215 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-httpd-config\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.871277 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zjwp\" (UniqueName: \"kubernetes.io/projected/f9407de8-78ab-4bf1-9f53-49e71656898a-kube-api-access-6zjwp\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.871366 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-ovndb-tls-certs\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.871400 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-combined-ca-bundle\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.880526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-httpd-config\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" 
Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.881018 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.886495 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-config\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.888768 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-combined-ca-bundle\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.890171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-ovndb-tls-certs\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:14 crc kubenswrapper[4717]: I0308 05:48:14.904780 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zjwp\" (UniqueName: \"kubernetes.io/projected/f9407de8-78ab-4bf1-9f53-49e71656898a-kube-api-access-6zjwp\") pod \"neutron-6fd6c6f4-sgbls\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:15 crc kubenswrapper[4717]: I0308 05:48:15.024219 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:15 crc kubenswrapper[4717]: I0308 05:48:15.483527 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56bd65676f-5n794"] Mar 08 05:48:15 crc kubenswrapper[4717]: I0308 05:48:15.510611 4717 generic.go:334] "Generic (PLEG): container finished" podID="90a9f655-96fd-4f95-bbd3-5bbf8db0faa5" containerID="14cd17fcca283abbc714961f8eb0a5659f169538f6fdbff2f48cbae56f3280d9" exitCode=0 Mar 08 05:48:15 crc kubenswrapper[4717]: I0308 05:48:15.510664 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549148-nrst9" event={"ID":"90a9f655-96fd-4f95-bbd3-5bbf8db0faa5","Type":"ContainerDied","Data":"14cd17fcca283abbc714961f8eb0a5659f169538f6fdbff2f48cbae56f3280d9"} Mar 08 05:48:15 crc kubenswrapper[4717]: W0308 05:48:15.511945 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b525ae_0a3d_41ba_b961_2e1fecce18b9.slice/crio-273f4198448dec3e01eec0537e568a2ca2ebcbd25270b5e41e2fa06407a40284 WatchSource:0}: Error finding container 273f4198448dec3e01eec0537e568a2ca2ebcbd25270b5e41e2fa06407a40284: Status 404 returned error can't find the container with id 273f4198448dec3e01eec0537e568a2ca2ebcbd25270b5e41e2fa06407a40284 Mar 08 05:48:15 crc kubenswrapper[4717]: I0308 05:48:15.516282 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 05:48:15 crc kubenswrapper[4717]: I0308 05:48:15.517023 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6345fc76-e42d-4a13-90d2-c2bd5135f073","Type":"ContainerStarted","Data":"ff8a472ff8d216e15089e5c35fb29cfbc3f9d5958266b97e84c498518a10c9cd"} Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.002590 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fd6c6f4-sgbls"] Mar 08 05:48:16 crc 
kubenswrapper[4717]: W0308 05:48:16.029474 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9407de8_78ab_4bf1_9f53_49e71656898a.slice/crio-0f6b2f069f6cf7065d1f8c91351da964b1870e6dad2b1c91239a0be75c31e9de WatchSource:0}: Error finding container 0f6b2f069f6cf7065d1f8c91351da964b1870e6dad2b1c91239a0be75c31e9de: Status 404 returned error can't find the container with id 0f6b2f069f6cf7065d1f8c91351da964b1870e6dad2b1c91239a0be75c31e9de Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.032657 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wvxkm" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.215998 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fclx\" (UniqueName: \"kubernetes.io/projected/ffc84338-48ac-4538-b134-5993d5a9f91c-kube-api-access-2fclx\") pod \"ffc84338-48ac-4538-b134-5993d5a9f91c\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.216132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-config-data\") pod \"ffc84338-48ac-4538-b134-5993d5a9f91c\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.216246 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-scripts\") pod \"ffc84338-48ac-4538-b134-5993d5a9f91c\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.216279 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffc84338-48ac-4538-b134-5993d5a9f91c-logs\") pod 
\"ffc84338-48ac-4538-b134-5993d5a9f91c\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.216356 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-combined-ca-bundle\") pod \"ffc84338-48ac-4538-b134-5993d5a9f91c\" (UID: \"ffc84338-48ac-4538-b134-5993d5a9f91c\") " Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.216923 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffc84338-48ac-4538-b134-5993d5a9f91c-logs" (OuterVolumeSpecName: "logs") pod "ffc84338-48ac-4538-b134-5993d5a9f91c" (UID: "ffc84338-48ac-4538-b134-5993d5a9f91c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.220061 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffc84338-48ac-4538-b134-5993d5a9f91c-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.220633 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-scripts" (OuterVolumeSpecName: "scripts") pod "ffc84338-48ac-4538-b134-5993d5a9f91c" (UID: "ffc84338-48ac-4538-b134-5993d5a9f91c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.220757 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc84338-48ac-4538-b134-5993d5a9f91c-kube-api-access-2fclx" (OuterVolumeSpecName: "kube-api-access-2fclx") pod "ffc84338-48ac-4538-b134-5993d5a9f91c" (UID: "ffc84338-48ac-4538-b134-5993d5a9f91c"). InnerVolumeSpecName "kube-api-access-2fclx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.255741 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-config-data" (OuterVolumeSpecName: "config-data") pod "ffc84338-48ac-4538-b134-5993d5a9f91c" (UID: "ffc84338-48ac-4538-b134-5993d5a9f91c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.258819 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffc84338-48ac-4538-b134-5993d5a9f91c" (UID: "ffc84338-48ac-4538-b134-5993d5a9f91c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.320717 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.320749 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.320760 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fclx\" (UniqueName: \"kubernetes.io/projected/ffc84338-48ac-4538-b134-5993d5a9f91c-kube-api-access-2fclx\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.320770 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc84338-48ac-4538-b134-5993d5a9f91c-config-data\") on node \"crc\" DevicePath \"\"" Mar 
08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.526791 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bd65676f-5n794" event={"ID":"f9b525ae-0a3d-41ba-b961-2e1fecce18b9","Type":"ContainerStarted","Data":"4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a"} Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.527073 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bd65676f-5n794" event={"ID":"f9b525ae-0a3d-41ba-b961-2e1fecce18b9","Type":"ContainerStarted","Data":"273f4198448dec3e01eec0537e568a2ca2ebcbd25270b5e41e2fa06407a40284"} Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.528434 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6345fc76-e42d-4a13-90d2-c2bd5135f073","Type":"ContainerStarted","Data":"9af55bd54a70d80bac5b48e8d5331e1d3df92c4f2ea98306d143c2816018a4ea"} Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.533763 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd6c6f4-sgbls" event={"ID":"f9407de8-78ab-4bf1-9f53-49e71656898a","Type":"ContainerStarted","Data":"0f6b2f069f6cf7065d1f8c91351da964b1870e6dad2b1c91239a0be75c31e9de"} Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.536154 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wvxkm" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.536139 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wvxkm" event={"ID":"ffc84338-48ac-4538-b134-5993d5a9f91c","Type":"ContainerDied","Data":"50df3aaa63164f5805a4ae45d7fe9a34e320c62d31efb087cbac5d52587e9dfb"} Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.536209 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50df3aaa63164f5805a4ae45d7fe9a34e320c62d31efb087cbac5d52587e9dfb" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.536972 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7755f67488-mclxw"] Mar 08 05:48:16 crc kubenswrapper[4717]: E0308 05:48:16.537357 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc84338-48ac-4538-b134-5993d5a9f91c" containerName="placement-db-sync" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.537375 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc84338-48ac-4538-b134-5993d5a9f91c" containerName="placement-db-sync" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.537596 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc84338-48ac-4538-b134-5993d5a9f91c" containerName="placement-db-sync" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.541864 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.544757 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.544966 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.545058 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-762pl" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.545305 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.546167 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.555771 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7755f67488-mclxw"] Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.726661 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-internal-tls-certs\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.726931 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-config-data\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.726977 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-public-tls-certs\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.726999 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-logs\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.727020 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-scripts\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.727044 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psf4r\" (UniqueName: \"kubernetes.io/projected/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-kube-api-access-psf4r\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.727070 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-combined-ca-bundle\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.829159 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-public-tls-certs\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.829206 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-logs\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.829229 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-scripts\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.829260 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psf4r\" (UniqueName: \"kubernetes.io/projected/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-kube-api-access-psf4r\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.829289 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-combined-ca-bundle\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.829358 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-internal-tls-certs\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.829400 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-config-data\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.830043 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-logs\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.835619 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-config-data\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.835808 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-combined-ca-bundle\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.838083 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-scripts\") pod \"placement-7755f67488-mclxw\" 
(UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.839159 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-public-tls-certs\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.842224 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-internal-tls-certs\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.848214 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psf4r\" (UniqueName: \"kubernetes.io/projected/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-kube-api-access-psf4r\") pod \"placement-7755f67488-mclxw\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.871594 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.907761 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549148-nrst9" Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.930817 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px5m4\" (UniqueName: \"kubernetes.io/projected/90a9f655-96fd-4f95-bbd3-5bbf8db0faa5-kube-api-access-px5m4\") pod \"90a9f655-96fd-4f95-bbd3-5bbf8db0faa5\" (UID: \"90a9f655-96fd-4f95-bbd3-5bbf8db0faa5\") " Mar 08 05:48:16 crc kubenswrapper[4717]: I0308 05:48:16.935878 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a9f655-96fd-4f95-bbd3-5bbf8db0faa5-kube-api-access-px5m4" (OuterVolumeSpecName: "kube-api-access-px5m4") pod "90a9f655-96fd-4f95-bbd3-5bbf8db0faa5" (UID: "90a9f655-96fd-4f95-bbd3-5bbf8db0faa5"). InnerVolumeSpecName "kube-api-access-px5m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.032181 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px5m4\" (UniqueName: \"kubernetes.io/projected/90a9f655-96fd-4f95-bbd3-5bbf8db0faa5-kube-api-access-px5m4\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.205143 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7755f67488-mclxw"] Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.547507 4717 generic.go:334] "Generic (PLEG): container finished" podID="f9b525ae-0a3d-41ba-b961-2e1fecce18b9" containerID="4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a" exitCode=0 Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.547548 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bd65676f-5n794" event={"ID":"f9b525ae-0a3d-41ba-b961-2e1fecce18b9","Type":"ContainerDied","Data":"4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a"} Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.554798 4717 
generic.go:334] "Generic (PLEG): container finished" podID="0ac4bbfc-288a-451e-8f03-864b4b2cb96e" containerID="207480c3692c1d3bfed26d5c8ae0a4b4e7e835919d7acf36b2e52514805ce2eb" exitCode=0 Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.554860 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j9lc2" event={"ID":"0ac4bbfc-288a-451e-8f03-864b4b2cb96e","Type":"ContainerDied","Data":"207480c3692c1d3bfed26d5c8ae0a4b4e7e835919d7acf36b2e52514805ce2eb"} Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.558758 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549148-nrst9" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.559171 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549148-nrst9" event={"ID":"90a9f655-96fd-4f95-bbd3-5bbf8db0faa5","Type":"ContainerDied","Data":"ce0508c056b05a738b55eac5ccf6d276ac47ad31548ef6246b87b57398a34552"} Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.559201 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce0508c056b05a738b55eac5ccf6d276ac47ad31548ef6246b87b57398a34552" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.577969 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd6c6f4-sgbls" event={"ID":"f9407de8-78ab-4bf1-9f53-49e71656898a","Type":"ContainerStarted","Data":"3e62a802a83a0fd7cc70d85b5b7b1816de1c84f06b89d893888a3028192bccb3"} Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.578013 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd6c6f4-sgbls" event={"ID":"f9407de8-78ab-4bf1-9f53-49e71656898a","Type":"ContainerStarted","Data":"b6b164a88de189fae0dae7303712cb2904ac004f920d737bdf1c3d765a634eca"} Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.579786 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.581728 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7755f67488-mclxw" event={"ID":"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7","Type":"ContainerStarted","Data":"6de8c89c19544c253cd53840e8f6e5ed0b5a42c35f1ea539971f1d57e8f856b4"} Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.606121 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6fd6c6f4-sgbls" podStartSLOduration=3.606106421 podStartE2EDuration="3.606106421s" podCreationTimestamp="2026-03-08 05:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:17.604232825 +0000 UTC m=+1324.521881669" watchObservedRunningTime="2026-03-08 05:48:17.606106421 +0000 UTC m=+1324.523755265" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.773742 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78c779987c-vpp7g"] Mar 08 05:48:17 crc kubenswrapper[4717]: E0308 05:48:17.774384 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a9f655-96fd-4f95-bbd3-5bbf8db0faa5" containerName="oc" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.774397 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a9f655-96fd-4f95-bbd3-5bbf8db0faa5" containerName="oc" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.774584 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a9f655-96fd-4f95-bbd3-5bbf8db0faa5" containerName="oc" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.775516 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.778028 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.778544 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.855162 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78c779987c-vpp7g"] Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.960909 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-combined-ca-bundle\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.960956 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-public-tls-certs\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.961023 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-config\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.961066 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5764\" (UniqueName: 
\"kubernetes.io/projected/3b21c262-66aa-47df-ad60-24b7a43031a3-kube-api-access-k5764\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.961091 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-ovndb-tls-certs\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.961149 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-httpd-config\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.961172 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-internal-tls-certs\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.970387 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549142-pp5ds"] Mar 08 05:48:17 crc kubenswrapper[4717]: I0308 05:48:17.978518 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549142-pp5ds"] Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.062938 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-combined-ca-bundle\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.062976 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-public-tls-certs\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.063029 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-config\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.063072 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5764\" (UniqueName: \"kubernetes.io/projected/3b21c262-66aa-47df-ad60-24b7a43031a3-kube-api-access-k5764\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.063096 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-ovndb-tls-certs\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.063133 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-httpd-config\") pod 
\"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.063155 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-internal-tls-certs\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.071013 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-config\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.071035 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-internal-tls-certs\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.071520 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-httpd-config\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.073435 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-ovndb-tls-certs\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc 
kubenswrapper[4717]: I0308 05:48:18.073793 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-combined-ca-bundle\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.074804 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-public-tls-certs\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.079629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5764\" (UniqueName: \"kubernetes.io/projected/3b21c262-66aa-47df-ad60-24b7a43031a3-kube-api-access-k5764\") pod \"neutron-78c779987c-vpp7g\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.116247 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.285836 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.480725 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.480961 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.528812 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.585078 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.585123 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.606307 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.606446 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.612032 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6345fc76-e42d-4a13-90d2-c2bd5135f073","Type":"ContainerStarted","Data":"75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319"} Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.622579 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7755f67488-mclxw" event={"ID":"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7","Type":"ContainerStarted","Data":"32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f"} Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.622636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7755f67488-mclxw" event={"ID":"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7","Type":"ContainerStarted","Data":"e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6"} Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.622698 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.622836 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.628082 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bd65676f-5n794" event={"ID":"f9b525ae-0a3d-41ba-b961-2e1fecce18b9","Type":"ContainerStarted","Data":"b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788"} Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.643973 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.646160 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.645952075 podStartE2EDuration="5.645952075s" podCreationTimestamp="2026-03-08 05:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:18.636537153 +0000 UTC m=+1325.554185987" watchObservedRunningTime="2026-03-08 05:48:18.645952075 +0000 UTC m=+1325.563600919" Mar 08 05:48:18 crc 
kubenswrapper[4717]: I0308 05:48:18.652183 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/watcher-api-0" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.652914 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.653066 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.662208 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56bd65676f-5n794" podStartSLOduration=4.662186385 podStartE2EDuration="4.662186385s" podCreationTimestamp="2026-03-08 05:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:18.658957486 +0000 UTC m=+1325.576606320" watchObservedRunningTime="2026-03-08 05:48:18.662186385 +0000 UTC m=+1325.579835229" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.693921 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7755f67488-mclxw" podStartSLOduration=2.6939032960000002 podStartE2EDuration="2.693903296s" podCreationTimestamp="2026-03-08 05:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:18.682922766 +0000 UTC m=+1325.600571610" watchObservedRunningTime="2026-03-08 05:48:18.693903296 +0000 UTC m=+1325.611552140" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.707548 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/watcher-decision-engine-0" Mar 08 05:48:18 crc kubenswrapper[4717]: I0308 05:48:18.815752 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78c779987c-vpp7g"] Mar 08 05:48:19 crc kubenswrapper[4717]: I0308 05:48:19.581217 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:19 crc kubenswrapper[4717]: I0308 05:48:19.581262 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:19 crc kubenswrapper[4717]: I0308 05:48:19.624915 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:19 crc kubenswrapper[4717]: I0308 05:48:19.634124 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:19 crc kubenswrapper[4717]: I0308 05:48:19.635816 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:19 crc kubenswrapper[4717]: I0308 05:48:19.635842 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:19 crc kubenswrapper[4717]: I0308 05:48:19.635851 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:19 crc kubenswrapper[4717]: I0308 05:48:19.648362 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 08 05:48:19 crc kubenswrapper[4717]: I0308 05:48:19.696961 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Mar 08 05:48:19 crc kubenswrapper[4717]: I0308 05:48:19.802619 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db22cf04-a81d-4990-8f88-41147bf3f149" 
path="/var/lib/kubelet/pods/db22cf04-a81d-4990-8f88-41147bf3f149/volumes" Mar 08 05:48:20 crc kubenswrapper[4717]: I0308 05:48:20.102908 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:48:20 crc kubenswrapper[4717]: I0308 05:48:20.103733 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:48:20 crc kubenswrapper[4717]: I0308 05:48:20.183600 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:48:20 crc kubenswrapper[4717]: I0308 05:48:20.183647 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:48:20 crc kubenswrapper[4717]: I0308 05:48:20.657259 4717 generic.go:334] "Generic (PLEG): container finished" podID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerID="425e9f3a695ff87478add2c75e6830e7d77823e6940c41a52102c0847b13e48b" exitCode=1 Mar 08 05:48:20 crc kubenswrapper[4717]: I0308 05:48:20.657466 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a","Type":"ContainerDied","Data":"425e9f3a695ff87478add2c75e6830e7d77823e6940c41a52102c0847b13e48b"} Mar 08 05:48:20 crc kubenswrapper[4717]: I0308 05:48:20.659284 4717 scope.go:117] "RemoveContainer" containerID="425e9f3a695ff87478add2c75e6830e7d77823e6940c41a52102c0847b13e48b" Mar 08 05:48:21 crc kubenswrapper[4717]: I0308 05:48:21.665150 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c779987c-vpp7g" event={"ID":"3b21c262-66aa-47df-ad60-24b7a43031a3","Type":"ContainerStarted","Data":"0dd173f4f4c9a480972f322245910443c573d1c7eab06887b5cc282ed79c179d"} Mar 08 05:48:21 crc kubenswrapper[4717]: I0308 05:48:21.665430 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 05:48:21 crc 
kubenswrapper[4717]: I0308 05:48:21.665465 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 05:48:22 crc kubenswrapper[4717]: I0308 05:48:22.593942 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:22 crc kubenswrapper[4717]: I0308 05:48:22.603942 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 05:48:22 crc kubenswrapper[4717]: I0308 05:48:22.686373 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j9lc2" event={"ID":"0ac4bbfc-288a-451e-8f03-864b4b2cb96e","Type":"ContainerDied","Data":"7dbe4bde86b93074065ca1934198a3249ed224557630cd0d641417f9ab6547ca"} Mar 08 05:48:22 crc kubenswrapper[4717]: I0308 05:48:22.686442 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dbe4bde86b93074065ca1934198a3249ed224557630cd0d641417f9ab6547ca" Mar 08 05:48:22 crc kubenswrapper[4717]: I0308 05:48:22.901175 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:48:22 crc kubenswrapper[4717]: I0308 05:48:22.901606 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api-log" containerID="cri-o://1d9b97bcd77096f12879669bbd083a51cf198a9747f27cd3730189e502557834" gracePeriod=30 Mar 08 05:48:22 crc kubenswrapper[4717]: I0308 05:48:22.901997 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api" containerID="cri-o://0d73df4708da23d730f3975f3149f1ab5449833fb5c2db88e445f08159ce201d" gracePeriod=30 Mar 08 05:48:22 crc kubenswrapper[4717]: I0308 05:48:22.940329 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.076232 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-config-data\") pod \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.076294 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-scripts\") pod \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.076349 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-combined-ca-bundle\") pod \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.076380 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-fernet-keys\") pod \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.076411 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-credential-keys\") pod \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.076470 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljnb8\" (UniqueName: 
\"kubernetes.io/projected/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-kube-api-access-ljnb8\") pod \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\" (UID: \"0ac4bbfc-288a-451e-8f03-864b4b2cb96e\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.088671 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-scripts" (OuterVolumeSpecName: "scripts") pod "0ac4bbfc-288a-451e-8f03-864b4b2cb96e" (UID: "0ac4bbfc-288a-451e-8f03-864b4b2cb96e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.091443 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0ac4bbfc-288a-451e-8f03-864b4b2cb96e" (UID: "0ac4bbfc-288a-451e-8f03-864b4b2cb96e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.094423 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0ac4bbfc-288a-451e-8f03-864b4b2cb96e" (UID: "0ac4bbfc-288a-451e-8f03-864b4b2cb96e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.098075 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-kube-api-access-ljnb8" (OuterVolumeSpecName: "kube-api-access-ljnb8") pod "0ac4bbfc-288a-451e-8f03-864b4b2cb96e" (UID: "0ac4bbfc-288a-451e-8f03-864b4b2cb96e"). InnerVolumeSpecName "kube-api-access-ljnb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.126757 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ac4bbfc-288a-451e-8f03-864b4b2cb96e" (UID: "0ac4bbfc-288a-451e-8f03-864b4b2cb96e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.136823 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-config-data" (OuterVolumeSpecName: "config-data") pod "0ac4bbfc-288a-451e-8f03-864b4b2cb96e" (UID: "0ac4bbfc-288a-451e-8f03-864b4b2cb96e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.178858 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.178889 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.178898 4717 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.178907 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljnb8\" (UniqueName: \"kubernetes.io/projected/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-kube-api-access-ljnb8\") on node \"crc\" 
DevicePath \"\"" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.178919 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.178928 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ac4bbfc-288a-451e-8f03-864b4b2cb96e-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.709500 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rlvrs" event={"ID":"6c734bf7-1916-4a47-93e0-42caaaced812","Type":"ContainerStarted","Data":"6d863e6acd09b337b0c38622de8d4fcc696d4705ecc9d106fceb5a7e9490d0c4"} Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.712593 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fba18a-af8c-449a-be74-e2ad6438afa0","Type":"ContainerStarted","Data":"51a17d9acbc55b1122dcd14627309e3fda8cdc98435b465ccbc287efb5d05b49"} Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.717489 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a","Type":"ContainerStarted","Data":"cd8dc25e4b6ba32a312e1e13dcb2b1aff222e3306582ef7c5a3989aedebc0684"} Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.722256 4717 generic.go:334] "Generic (PLEG): container finished" podID="62b1d992-0078-406b-ade7-6710e9a62c96" containerID="0d73df4708da23d730f3975f3149f1ab5449833fb5c2db88e445f08159ce201d" exitCode=0 Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.722290 4717 generic.go:334] "Generic (PLEG): container finished" podID="62b1d992-0078-406b-ade7-6710e9a62c96" containerID="1d9b97bcd77096f12879669bbd083a51cf198a9747f27cd3730189e502557834" exitCode=143 Mar 08 05:48:23 crc 
kubenswrapper[4717]: I0308 05:48:23.722326 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"62b1d992-0078-406b-ade7-6710e9a62c96","Type":"ContainerDied","Data":"0d73df4708da23d730f3975f3149f1ab5449833fb5c2db88e445f08159ce201d"} Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.722359 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"62b1d992-0078-406b-ade7-6710e9a62c96","Type":"ContainerDied","Data":"1d9b97bcd77096f12879669bbd083a51cf198a9747f27cd3730189e502557834"} Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.726600 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c779987c-vpp7g" event={"ID":"3b21c262-66aa-47df-ad60-24b7a43031a3","Type":"ContainerStarted","Data":"d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9"} Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.726633 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c779987c-vpp7g" event={"ID":"3b21c262-66aa-47df-ad60-24b7a43031a3","Type":"ContainerStarted","Data":"deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce"} Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.726647 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.726701 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j9lc2" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.758184 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rlvrs" podStartSLOduration=3.334576696 podStartE2EDuration="43.758164505s" podCreationTimestamp="2026-03-08 05:47:40 +0000 UTC" firstStartedPulling="2026-03-08 05:47:42.335224479 +0000 UTC m=+1289.252873323" lastFinishedPulling="2026-03-08 05:48:22.758812288 +0000 UTC m=+1329.676461132" observedRunningTime="2026-03-08 05:48:23.736966993 +0000 UTC m=+1330.654615837" watchObservedRunningTime="2026-03-08 05:48:23.758164505 +0000 UTC m=+1330.675813349" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.771878 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78c779987c-vpp7g" podStartSLOduration=6.771859623 podStartE2EDuration="6.771859623s" podCreationTimestamp="2026-03-08 05:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:23.770052688 +0000 UTC m=+1330.687701532" watchObservedRunningTime="2026-03-08 05:48:23.771859623 +0000 UTC m=+1330.689508477" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.801170 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.889665 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-custom-prometheus-ca\") pod \"62b1d992-0078-406b-ade7-6710e9a62c96\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.889916 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plglf\" (UniqueName: \"kubernetes.io/projected/62b1d992-0078-406b-ade7-6710e9a62c96-kube-api-access-plglf\") pod \"62b1d992-0078-406b-ade7-6710e9a62c96\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.890458 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-combined-ca-bundle\") pod \"62b1d992-0078-406b-ade7-6710e9a62c96\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.890511 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-config-data\") pod \"62b1d992-0078-406b-ade7-6710e9a62c96\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.890546 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b1d992-0078-406b-ade7-6710e9a62c96-logs\") pod \"62b1d992-0078-406b-ade7-6710e9a62c96\" (UID: \"62b1d992-0078-406b-ade7-6710e9a62c96\") " Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.893004 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/62b1d992-0078-406b-ade7-6710e9a62c96-logs" (OuterVolumeSpecName: "logs") pod "62b1d992-0078-406b-ade7-6710e9a62c96" (UID: "62b1d992-0078-406b-ade7-6710e9a62c96"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.897139 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b1d992-0078-406b-ade7-6710e9a62c96-kube-api-access-plglf" (OuterVolumeSpecName: "kube-api-access-plglf") pod "62b1d992-0078-406b-ade7-6710e9a62c96" (UID: "62b1d992-0078-406b-ade7-6710e9a62c96"). InnerVolumeSpecName "kube-api-access-plglf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.932227 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62b1d992-0078-406b-ade7-6710e9a62c96" (UID: "62b1d992-0078-406b-ade7-6710e9a62c96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.941281 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "62b1d992-0078-406b-ade7-6710e9a62c96" (UID: "62b1d992-0078-406b-ade7-6710e9a62c96"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.951432 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-config-data" (OuterVolumeSpecName: "config-data") pod "62b1d992-0078-406b-ade7-6710e9a62c96" (UID: "62b1d992-0078-406b-ade7-6710e9a62c96"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.994573 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plglf\" (UniqueName: \"kubernetes.io/projected/62b1d992-0078-406b-ade7-6710e9a62c96-kube-api-access-plglf\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.994649 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.994659 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.994716 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b1d992-0078-406b-ade7-6710e9a62c96-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:23 crc kubenswrapper[4717]: I0308 05:48:23.994728 4717 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/62b1d992-0078-406b-ade7-6710e9a62c96-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.015075 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.015134 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.045389 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 05:48:24 crc 
kubenswrapper[4717]: I0308 05:48:24.056334 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.126774 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5799fc9f64-fmph6"] Mar 08 05:48:24 crc kubenswrapper[4717]: E0308 05:48:24.127127 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac4bbfc-288a-451e-8f03-864b4b2cb96e" containerName="keystone-bootstrap" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.127143 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac4bbfc-288a-451e-8f03-864b4b2cb96e" containerName="keystone-bootstrap" Mar 08 05:48:24 crc kubenswrapper[4717]: E0308 05:48:24.127155 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api-log" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.127161 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api-log" Mar 08 05:48:24 crc kubenswrapper[4717]: E0308 05:48:24.127186 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.127193 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.127339 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api-log" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.127357 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.127366 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac4bbfc-288a-451e-8f03-864b4b2cb96e" containerName="keystone-bootstrap" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.127934 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.130945 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.131289 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.131532 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.131566 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.131669 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z9bhb" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.131795 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.151311 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5799fc9f64-fmph6"] Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.198592 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrvlt\" (UniqueName: \"kubernetes.io/projected/4270902e-1721-4286-be1f-baadb9dc68c1-kube-api-access-mrvlt\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.198753 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-credential-keys\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.198792 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-combined-ca-bundle\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.198819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-scripts\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.198849 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-public-tls-certs\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.198864 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-fernet-keys\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.198960 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-config-data\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.199043 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-internal-tls-certs\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.300698 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-credential-keys\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.300751 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-combined-ca-bundle\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.300782 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-scripts\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.300813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-public-tls-certs\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.300836 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-fernet-keys\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.300911 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-config-data\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.300992 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-internal-tls-certs\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.301022 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrvlt\" (UniqueName: \"kubernetes.io/projected/4270902e-1721-4286-be1f-baadb9dc68c1-kube-api-access-mrvlt\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.305950 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-public-tls-certs\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.308016 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-config-data\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.312018 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-scripts\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.312272 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-internal-tls-certs\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.312565 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-combined-ca-bundle\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.312644 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-credential-keys\") pod \"keystone-5799fc9f64-fmph6\" (UID: 
\"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.317525 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4270902e-1721-4286-be1f-baadb9dc68c1-fernet-keys\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.323159 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrvlt\" (UniqueName: \"kubernetes.io/projected/4270902e-1721-4286-be1f-baadb9dc68c1-kube-api-access-mrvlt\") pod \"keystone-5799fc9f64-fmph6\" (UID: \"4270902e-1721-4286-be1f-baadb9dc68c1\") " pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.442918 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.739873 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"62b1d992-0078-406b-ade7-6710e9a62c96","Type":"ContainerDied","Data":"4f45f9c20ff0b17f2f862b411f18740f87fa5ba961d64c443a5261047bc799be"} Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.740265 4717 scope.go:117] "RemoveContainer" containerID="0d73df4708da23d730f3975f3149f1ab5449833fb5c2db88e445f08159ce201d" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.740484 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.746252 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.746290 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.791473 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.806740 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.816182 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.817606 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.831205 4717 scope.go:117] "RemoveContainer" containerID="1d9b97bcd77096f12879669bbd083a51cf198a9747f27cd3730189e502557834" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.831589 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.832043 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.832232 4717 scope.go:117] "RemoveContainer" containerID="67fa1baa53e060f43bc4bef1c40e9e9a00db4675f8154bea28df2ea9f4b62c84" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.832294 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.844800 4717 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/watcher-api-0"] Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.883851 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.927677 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.927759 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kzdt\" (UniqueName: \"kubernetes.io/projected/773c0536-3f49-45c0-ae25-88e62b1c97e4-kube-api-access-6kzdt\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.927794 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773c0536-3f49-45c0-ae25-88e62b1c97e4-logs\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.927840 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.927895 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-public-tls-certs\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.927933 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-config-data\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:24 crc kubenswrapper[4717]: I0308 05:48:24.927966 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.001864 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5555fccc9f-gxdtl"] Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.002236 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" podUID="6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" containerName="dnsmasq-dns" containerID="cri-o://03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90" gracePeriod=10 Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.015541 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5799fc9f64-fmph6"] Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.031111 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 
crc kubenswrapper[4717]: I0308 05:48:25.031184 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kzdt\" (UniqueName: \"kubernetes.io/projected/773c0536-3f49-45c0-ae25-88e62b1c97e4-kube-api-access-6kzdt\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.031213 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773c0536-3f49-45c0-ae25-88e62b1c97e4-logs\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.031245 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.031295 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-public-tls-certs\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.032247 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773c0536-3f49-45c0-ae25-88e62b1c97e4-logs\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.033542 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-config-data\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.033644 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.034846 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.039875 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.045856 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-public-tls-certs\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.051546 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kzdt\" (UniqueName: \"kubernetes.io/projected/773c0536-3f49-45c0-ae25-88e62b1c97e4-kube-api-access-6kzdt\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc 
kubenswrapper[4717]: I0308 05:48:25.057100 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-config-data\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.061819 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.148524 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.743481 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.753866 4717 generic.go:334] "Generic (PLEG): container finished" podID="6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" containerID="03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90" exitCode=0 Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.753917 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" event={"ID":"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10","Type":"ContainerDied","Data":"03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90"} Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.753940 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" event={"ID":"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10","Type":"ContainerDied","Data":"87d29f1eb8804f06c6632b306457130ee09bacc49454aa5b95367a2b5d1f03f1"} Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.753956 4717 
scope.go:117] "RemoveContainer" containerID="03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.754032 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5555fccc9f-gxdtl" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.762912 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5799fc9f64-fmph6" event={"ID":"4270902e-1721-4286-be1f-baadb9dc68c1","Type":"ContainerStarted","Data":"2b6495c7cf1a02a79114896fee8f33b258c64e5a17c870e278aa2602c8f78698"} Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.762947 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5799fc9f64-fmph6" event={"ID":"4270902e-1721-4286-be1f-baadb9dc68c1","Type":"ContainerStarted","Data":"dfff28b12106cdae260e087a1af342420524e6870dd961619874508f57c1d2f6"} Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.762961 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.791486 4717 scope.go:117] "RemoveContainer" containerID="4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.803622 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5799fc9f64-fmph6" podStartSLOduration=1.803595102 podStartE2EDuration="1.803595102s" podCreationTimestamp="2026-03-08 05:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:25.791955025 +0000 UTC m=+1332.709603869" watchObservedRunningTime="2026-03-08 05:48:25.803595102 +0000 UTC m=+1332.721243946" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.806050 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="62b1d992-0078-406b-ade7-6710e9a62c96" path="/var/lib/kubelet/pods/62b1d992-0078-406b-ade7-6710e9a62c96/volumes" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.819887 4717 scope.go:117] "RemoveContainer" containerID="03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90" Mar 08 05:48:25 crc kubenswrapper[4717]: E0308 05:48:25.844877 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90\": container with ID starting with 03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90 not found: ID does not exist" containerID="03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.844929 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90"} err="failed to get container status \"03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90\": rpc error: code = NotFound desc = could not find container \"03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90\": container with ID starting with 03bed3ec0c446e34224e514f8a2d82b743a4711407d4fcabe016b396a417bc90 not found: ID does not exist" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.844955 4717 scope.go:117] "RemoveContainer" containerID="4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb" Mar 08 05:48:25 crc kubenswrapper[4717]: E0308 05:48:25.847511 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb\": container with ID starting with 4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb not found: ID does not exist" 
containerID="4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.847565 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb"} err="failed to get container status \"4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb\": rpc error: code = NotFound desc = could not find container \"4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb\": container with ID starting with 4bdcb3e4cde4397814b68d7197a211ab2076dcbd8ce1b67651e0cfe67efa28fb not found: ID does not exist" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.864840 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-nb\") pod \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.864902 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-config\") pod \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.864923 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-svc\") pod \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.865042 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-sb\") pod 
\"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.865086 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kdx6\" (UniqueName: \"kubernetes.io/projected/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-kube-api-access-5kdx6\") pod \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.865167 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-swift-storage-0\") pod \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.871351 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.922359 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-kube-api-access-5kdx6" (OuterVolumeSpecName: "kube-api-access-5kdx6") pod "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" (UID: "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10"). InnerVolumeSpecName "kube-api-access-5kdx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.970782 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" (UID: "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.970967 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-svc\") pod \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\" (UID: \"6e8f63aa-0f92-405e-ac59-9c4ddb14ff10\") " Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.971561 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kdx6\" (UniqueName: \"kubernetes.io/projected/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-kube-api-access-5kdx6\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:25 crc kubenswrapper[4717]: W0308 05:48:25.972327 4717 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10/volumes/kubernetes.io~configmap/dns-svc Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.972347 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" (UID: "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.980528 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" (UID: "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.988182 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" (UID: "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:25 crc kubenswrapper[4717]: I0308 05:48:25.988548 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-config" (OuterVolumeSpecName: "config") pod "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" (UID: "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.019364 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" (UID: "6e8f63aa-0f92-405e-ac59-9c4ddb14ff10"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.073929 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.073975 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.073997 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.074014 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.074034 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.120065 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5555fccc9f-gxdtl"] Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.138610 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5555fccc9f-gxdtl"] Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.810856 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"773c0536-3f49-45c0-ae25-88e62b1c97e4","Type":"ContainerStarted","Data":"79e11794d337c00b54012f0999785dd3251bf62880c9a5bf907f504347aa5dbd"} Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.811239 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"773c0536-3f49-45c0-ae25-88e62b1c97e4","Type":"ContainerStarted","Data":"f13035681b8f54a7e25632c6689dfd30c854f5f2a6aa27e142ef2bb93a7ef77b"} Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.811254 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"773c0536-3f49-45c0-ae25-88e62b1c97e4","Type":"ContainerStarted","Data":"0033d2c1dfe31597618634c81450476366fe350da3becf8f7ff928b0f9390d41"} Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.811295 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.818031 4717 generic.go:334] "Generic (PLEG): container finished" podID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerID="cd8dc25e4b6ba32a312e1e13dcb2b1aff222e3306582ef7c5a3989aedebc0684" exitCode=1 Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.818113 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a","Type":"ContainerDied","Data":"cd8dc25e4b6ba32a312e1e13dcb2b1aff222e3306582ef7c5a3989aedebc0684"} Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.818159 4717 scope.go:117] "RemoveContainer" containerID="425e9f3a695ff87478add2c75e6830e7d77823e6940c41a52102c0847b13e48b" Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.818618 4717 scope.go:117] "RemoveContainer" containerID="cd8dc25e4b6ba32a312e1e13dcb2b1aff222e3306582ef7c5a3989aedebc0684" Mar 08 05:48:26 crc kubenswrapper[4717]: E0308 05:48:26.818896 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(f29ab6b7-97ec-4d9d-ba67-d4abad06de9a)\"" pod="openstack/watcher-decision-engine-0" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" Mar 08 05:48:26 crc kubenswrapper[4717]: I0308 05:48:26.837292 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.837271814 podStartE2EDuration="2.837271814s" podCreationTimestamp="2026-03-08 05:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:26.830325233 +0000 UTC m=+1333.747974077" watchObservedRunningTime="2026-03-08 05:48:26.837271814 +0000 UTC m=+1333.754920658" Mar 08 05:48:27 crc kubenswrapper[4717]: I0308 05:48:27.792320 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" path="/var/lib/kubelet/pods/6e8f63aa-0f92-405e-ac59-9c4ddb14ff10/volumes" Mar 08 05:48:27 crc kubenswrapper[4717]: I0308 05:48:27.833282 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-66q5h" event={"ID":"ec6c6686-44c7-49ec-950b-7054d96e207d","Type":"ContainerStarted","Data":"23e1e15c00e1ecd322c7ffaa7bb7932a2bd02737f81ba3236f83f3d8be2efd7e"} Mar 08 05:48:27 crc kubenswrapper[4717]: I0308 05:48:27.834917 4717 generic.go:334] "Generic (PLEG): container finished" podID="6c734bf7-1916-4a47-93e0-42caaaced812" containerID="6d863e6acd09b337b0c38622de8d4fcc696d4705ecc9d106fceb5a7e9490d0c4" exitCode=0 Mar 08 05:48:27 crc kubenswrapper[4717]: I0308 05:48:27.834982 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rlvrs" event={"ID":"6c734bf7-1916-4a47-93e0-42caaaced812","Type":"ContainerDied","Data":"6d863e6acd09b337b0c38622de8d4fcc696d4705ecc9d106fceb5a7e9490d0c4"} Mar 08 05:48:27 crc 
kubenswrapper[4717]: I0308 05:48:27.856309 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-66q5h" podStartSLOduration=3.18524422 podStartE2EDuration="47.856287036s" podCreationTimestamp="2026-03-08 05:47:40 +0000 UTC" firstStartedPulling="2026-03-08 05:47:42.325199692 +0000 UTC m=+1289.242848536" lastFinishedPulling="2026-03-08 05:48:26.996242508 +0000 UTC m=+1333.913891352" observedRunningTime="2026-03-08 05:48:27.847984101 +0000 UTC m=+1334.765632945" watchObservedRunningTime="2026-03-08 05:48:27.856287036 +0000 UTC m=+1334.773935880" Mar 08 05:48:28 crc kubenswrapper[4717]: I0308 05:48:28.296868 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 05:48:28 crc kubenswrapper[4717]: I0308 05:48:28.297217 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 05:48:28 crc kubenswrapper[4717]: I0308 05:48:28.300439 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 05:48:28 crc kubenswrapper[4717]: I0308 05:48:28.480200 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 08 05:48:28 crc kubenswrapper[4717]: I0308 05:48:28.480248 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 08 05:48:28 crc kubenswrapper[4717]: I0308 05:48:28.480884 4717 scope.go:117] "RemoveContainer" containerID="cd8dc25e4b6ba32a312e1e13dcb2b1aff222e3306582ef7c5a3989aedebc0684" Mar 08 05:48:28 crc kubenswrapper[4717]: E0308 05:48:28.481089 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine 
pod=watcher-decision-engine-0_openstack(f29ab6b7-97ec-4d9d-ba67-d4abad06de9a)\"" pod="openstack/watcher-decision-engine-0" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" Mar 08 05:48:28 crc kubenswrapper[4717]: I0308 05:48:28.606985 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 05:48:28 crc kubenswrapper[4717]: I0308 05:48:28.606985 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="62b1d992-0078-406b-ade7-6710e9a62c96" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": dial tcp 10.217.0.169:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 08 05:48:29 crc kubenswrapper[4717]: I0308 05:48:29.650627 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 08 05:48:30 crc kubenswrapper[4717]: I0308 05:48:30.149374 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 08 05:48:31 crc kubenswrapper[4717]: I0308 05:48:31.913370 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:48:31 crc kubenswrapper[4717]: I0308 05:48:31.951751 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.093349 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.223569 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlqkp\" (UniqueName: \"kubernetes.io/projected/6c734bf7-1916-4a47-93e0-42caaaced812-kube-api-access-nlqkp\") pod \"6c734bf7-1916-4a47-93e0-42caaaced812\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.223667 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-combined-ca-bundle\") pod \"6c734bf7-1916-4a47-93e0-42caaaced812\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.223841 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-db-sync-config-data\") pod \"6c734bf7-1916-4a47-93e0-42caaaced812\" (UID: \"6c734bf7-1916-4a47-93e0-42caaaced812\") " Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.228092 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c734bf7-1916-4a47-93e0-42caaaced812-kube-api-access-nlqkp" (OuterVolumeSpecName: "kube-api-access-nlqkp") pod "6c734bf7-1916-4a47-93e0-42caaaced812" (UID: "6c734bf7-1916-4a47-93e0-42caaaced812"). InnerVolumeSpecName "kube-api-access-nlqkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.235486 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6c734bf7-1916-4a47-93e0-42caaaced812" (UID: "6c734bf7-1916-4a47-93e0-42caaaced812"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.263368 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c734bf7-1916-4a47-93e0-42caaaced812" (UID: "6c734bf7-1916-4a47-93e0-42caaaced812"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.327137 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlqkp\" (UniqueName: \"kubernetes.io/projected/6c734bf7-1916-4a47-93e0-42caaaced812-kube-api-access-nlqkp\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.327187 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.327209 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c734bf7-1916-4a47-93e0-42caaaced812-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.744148 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-69bcb664dd-nb94m" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.748791 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.865380 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cf759c7cb-qxb65"] Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.904065 4717 generic.go:334] "Generic (PLEG): 
container finished" podID="ec6c6686-44c7-49ec-950b-7054d96e207d" containerID="23e1e15c00e1ecd322c7ffaa7bb7932a2bd02737f81ba3236f83f3d8be2efd7e" exitCode=0 Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.904119 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-66q5h" event={"ID":"ec6c6686-44c7-49ec-950b-7054d96e207d","Type":"ContainerDied","Data":"23e1e15c00e1ecd322c7ffaa7bb7932a2bd02737f81ba3236f83f3d8be2efd7e"} Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.906451 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rlvrs" event={"ID":"6c734bf7-1916-4a47-93e0-42caaaced812","Type":"ContainerDied","Data":"cfac0f7783e1d7179f3cacfa158c065d54546d7a248bca6a00e3edd10d38e709"} Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.906477 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfac0f7783e1d7179f3cacfa158c065d54546d7a248bca6a00e3edd10d38e709" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.906537 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rlvrs" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.922295 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fba18a-af8c-449a-be74-e2ad6438afa0","Type":"ContainerStarted","Data":"61ce5c0deb4b32b6586eb00f56b8e91449469492b9cba140c467eadd3a213697"} Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.922413 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cf759c7cb-qxb65" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon-log" containerID="cri-o://35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200" gracePeriod=30 Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.922507 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cf759c7cb-qxb65" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon" containerID="cri-o://a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67" gracePeriod=30 Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.925026 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="ceilometer-central-agent" containerID="cri-o://56d6adbe1a47bc739f60662d8c5fcff494f54d3c7e7935ce9d54cdbdb0582939" gracePeriod=30 Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.925103 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="sg-core" containerID="cri-o://51a17d9acbc55b1122dcd14627309e3fda8cdc98435b465ccbc287efb5d05b49" gracePeriod=30 Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.925064 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.925202 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="proxy-httpd" containerID="cri-o://61ce5c0deb4b32b6586eb00f56b8e91449469492b9cba140c467eadd3a213697" gracePeriod=30 Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.925220 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="ceilometer-notification-agent" containerID="cri-o://20143731b1089e4e5df5f39c5474d218cc338bc7f42a3ac3401f869d4dfd0375" gracePeriod=30 Mar 08 05:48:33 crc kubenswrapper[4717]: I0308 05:48:33.961551 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.251271366 podStartE2EDuration="53.961532908s" podCreationTimestamp="2026-03-08 05:47:40 +0000 UTC" firstStartedPulling="2026-03-08 05:47:42.370059957 +0000 UTC m=+1289.287708791" lastFinishedPulling="2026-03-08 05:48:33.080321459 +0000 UTC m=+1339.997970333" observedRunningTime="2026-03-08 05:48:33.958846612 +0000 UTC m=+1340.876495456" watchObservedRunningTime="2026-03-08 05:48:33.961532908 +0000 UTC m=+1340.879181752" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.120091 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.120396 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 
08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.354517 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-f775c4c7-brpx4"] Mar 08 05:48:34 crc kubenswrapper[4717]: E0308 05:48:34.354954 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" containerName="dnsmasq-dns" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.354971 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" containerName="dnsmasq-dns" Mar 08 05:48:34 crc kubenswrapper[4717]: E0308 05:48:34.354990 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c734bf7-1916-4a47-93e0-42caaaced812" containerName="barbican-db-sync" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.354997 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c734bf7-1916-4a47-93e0-42caaaced812" containerName="barbican-db-sync" Mar 08 05:48:34 crc kubenswrapper[4717]: E0308 05:48:34.355012 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" containerName="init" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.355017 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" containerName="init" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.355176 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e8f63aa-0f92-405e-ac59-9c4ddb14ff10" containerName="dnsmasq-dns" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.355194 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c734bf7-1916-4a47-93e0-42caaaced812" containerName="barbican-db-sync" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.356199 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.362214 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-79d58bf7f8-bq7ms"] Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.362883 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-swvz4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.363804 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.371288 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.371406 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.371617 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.379235 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f775c4c7-brpx4"] Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.405130 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-79d58bf7f8-bq7ms"] Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.422617 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ddb99d68c-jzg85"] Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.424134 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.456896 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddb99d68c-jzg85"] Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.462637 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7bf9dbc-ec82-4659-92fe-509f95574ef3-logs\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.464753 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bf9dbc-ec82-4659-92fe-509f95574ef3-config-data\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.464792 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47805c28-c90d-4882-a0ed-5e531fb545b4-combined-ca-bundle\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.464817 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plxxw\" (UniqueName: \"kubernetes.io/projected/d7bf9dbc-ec82-4659-92fe-509f95574ef3-kube-api-access-plxxw\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.464863 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4n7f\" (UniqueName: \"kubernetes.io/projected/47805c28-c90d-4882-a0ed-5e531fb545b4-kube-api-access-q4n7f\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.464898 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47805c28-c90d-4882-a0ed-5e531fb545b4-config-data\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.464964 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47805c28-c90d-4882-a0ed-5e531fb545b4-logs\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.464983 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7bf9dbc-ec82-4659-92fe-509f95574ef3-config-data-custom\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.465002 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47805c28-c90d-4882-a0ed-5e531fb545b4-config-data-custom\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: 
\"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.465058 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bf9dbc-ec82-4659-92fe-509f95574ef3-combined-ca-bundle\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.518369 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58f95cdfb-8kctw"] Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.519790 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.525469 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.540267 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58f95cdfb-8kctw"] Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568621 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47805c28-c90d-4882-a0ed-5e531fb545b4-logs\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " 
pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568746 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7bf9dbc-ec82-4659-92fe-509f95574ef3-config-data-custom\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568766 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47805c28-c90d-4882-a0ed-5e531fb545b4-config-data-custom\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568802 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bf9dbc-ec82-4659-92fe-509f95574ef3-combined-ca-bundle\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568824 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568848 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568874 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data-custom\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568897 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-combined-ca-bundle\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568915 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7bf9dbc-ec82-4659-92fe-509f95574ef3-logs\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568949 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-logs\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.568974 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk8zb\" (UniqueName: 
\"kubernetes.io/projected/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-kube-api-access-fk8zb\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569002 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-config\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569025 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bf9dbc-ec82-4659-92fe-509f95574ef3-config-data\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569040 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-svc\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569058 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47805c28-c90d-4882-a0ed-5e531fb545b4-combined-ca-bundle\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569077 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plxxw\" (UniqueName: 
\"kubernetes.io/projected/d7bf9dbc-ec82-4659-92fe-509f95574ef3-kube-api-access-plxxw\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569095 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-794zc\" (UniqueName: \"kubernetes.io/projected/f03fa5b6-b0dc-4284-8727-1205e37f854f-kube-api-access-794zc\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569122 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4n7f\" (UniqueName: \"kubernetes.io/projected/47805c28-c90d-4882-a0ed-5e531fb545b4-kube-api-access-q4n7f\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569144 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569165 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47805c28-c90d-4882-a0ed-5e531fb545b4-config-data\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569601 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7bf9dbc-ec82-4659-92fe-509f95574ef3-logs\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.569661 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47805c28-c90d-4882-a0ed-5e531fb545b4-logs\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.576059 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7bf9dbc-ec82-4659-92fe-509f95574ef3-config-data-custom\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.576858 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47805c28-c90d-4882-a0ed-5e531fb545b4-config-data\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.581042 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47805c28-c90d-4882-a0ed-5e531fb545b4-config-data-custom\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.585150 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bf9dbc-ec82-4659-92fe-509f95574ef3-config-data\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.592523 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47805c28-c90d-4882-a0ed-5e531fb545b4-combined-ca-bundle\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.592793 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bf9dbc-ec82-4659-92fe-509f95574ef3-combined-ca-bundle\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.598942 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plxxw\" (UniqueName: \"kubernetes.io/projected/d7bf9dbc-ec82-4659-92fe-509f95574ef3-kube-api-access-plxxw\") pod \"barbican-worker-f775c4c7-brpx4\" (UID: \"d7bf9dbc-ec82-4659-92fe-509f95574ef3\") " pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.599194 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4n7f\" (UniqueName: \"kubernetes.io/projected/47805c28-c90d-4882-a0ed-5e531fb545b4-kube-api-access-q4n7f\") pod \"barbican-keystone-listener-79d58bf7f8-bq7ms\" (UID: \"47805c28-c90d-4882-a0ed-5e531fb545b4\") " pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671332 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-combined-ca-bundle\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671384 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-logs\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671418 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk8zb\" (UniqueName: \"kubernetes.io/projected/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-kube-api-access-fk8zb\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671441 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-config\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-svc\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671488 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-794zc\" 
(UniqueName: \"kubernetes.io/projected/f03fa5b6-b0dc-4284-8727-1205e37f854f-kube-api-access-794zc\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671519 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671548 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671609 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671632 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.671654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data-custom\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.672399 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-logs\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.672758 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-svc\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.673127 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-config\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.673263 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.673802 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: 
\"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.674262 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.675139 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data-custom\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.675751 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-combined-ca-bundle\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.677029 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.690791 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-794zc\" (UniqueName: \"kubernetes.io/projected/f03fa5b6-b0dc-4284-8727-1205e37f854f-kube-api-access-794zc\") pod \"dnsmasq-dns-5ddb99d68c-jzg85\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " 
pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.696023 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk8zb\" (UniqueName: \"kubernetes.io/projected/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-kube-api-access-fk8zb\") pod \"barbican-api-58f95cdfb-8kctw\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.710337 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f775c4c7-brpx4" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.721367 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.762376 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.840447 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.957511 4717 generic.go:334] "Generic (PLEG): container finished" podID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerID="61ce5c0deb4b32b6586eb00f56b8e91449469492b9cba140c467eadd3a213697" exitCode=0 Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.957543 4717 generic.go:334] "Generic (PLEG): container finished" podID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerID="51a17d9acbc55b1122dcd14627309e3fda8cdc98435b465ccbc287efb5d05b49" exitCode=2 Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.957553 4717 generic.go:334] "Generic (PLEG): container finished" podID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerID="56d6adbe1a47bc739f60662d8c5fcff494f54d3c7e7935ce9d54cdbdb0582939" exitCode=0 Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.957556 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fba18a-af8c-449a-be74-e2ad6438afa0","Type":"ContainerDied","Data":"61ce5c0deb4b32b6586eb00f56b8e91449469492b9cba140c467eadd3a213697"} Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.957613 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fba18a-af8c-449a-be74-e2ad6438afa0","Type":"ContainerDied","Data":"51a17d9acbc55b1122dcd14627309e3fda8cdc98435b465ccbc287efb5d05b49"} Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.957626 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fba18a-af8c-449a-be74-e2ad6438afa0","Type":"ContainerDied","Data":"56d6adbe1a47bc739f60662d8c5fcff494f54d3c7e7935ce9d54cdbdb0582939"} Mar 08 05:48:34 crc kubenswrapper[4717]: I0308 05:48:34.960366 4717 generic.go:334] "Generic (PLEG): container finished" podID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerID="a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67" exitCode=0 Mar 08 05:48:34 
crc kubenswrapper[4717]: I0308 05:48:34.960402 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cf759c7cb-qxb65" event={"ID":"a68b99b9-3abd-4e46-b116-c740daf70c8f","Type":"ContainerDied","Data":"a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67"} Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.150066 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.159607 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.235608 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-79d58bf7f8-bq7ms"] Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.278459 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f775c4c7-brpx4"] Mar 08 05:48:35 crc kubenswrapper[4717]: W0308 05:48:35.400862 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf03fa5b6_b0dc_4284_8727_1205e37f854f.slice/crio-26f6021acd934e3ce8b51ad9f121059a2bbd05137ac6384832e5854d79cfb68f WatchSource:0}: Error finding container 26f6021acd934e3ce8b51ad9f121059a2bbd05137ac6384832e5854d79cfb68f: Status 404 returned error can't find the container with id 26f6021acd934e3ce8b51ad9f121059a2bbd05137ac6384832e5854d79cfb68f Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.400995 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddb99d68c-jzg85"] Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.418977 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-66q5h" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.491559 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwzn7\" (UniqueName: \"kubernetes.io/projected/ec6c6686-44c7-49ec-950b-7054d96e207d-kube-api-access-fwzn7\") pod \"ec6c6686-44c7-49ec-950b-7054d96e207d\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.491621 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-scripts\") pod \"ec6c6686-44c7-49ec-950b-7054d96e207d\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.491731 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-combined-ca-bundle\") pod \"ec6c6686-44c7-49ec-950b-7054d96e207d\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.491819 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-db-sync-config-data\") pod \"ec6c6686-44c7-49ec-950b-7054d96e207d\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.491854 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6c6686-44c7-49ec-950b-7054d96e207d-etc-machine-id\") pod \"ec6c6686-44c7-49ec-950b-7054d96e207d\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.491878 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-config-data\") pod \"ec6c6686-44c7-49ec-950b-7054d96e207d\" (UID: \"ec6c6686-44c7-49ec-950b-7054d96e207d\") " Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.495517 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec6c6686-44c7-49ec-950b-7054d96e207d-kube-api-access-fwzn7" (OuterVolumeSpecName: "kube-api-access-fwzn7") pod "ec6c6686-44c7-49ec-950b-7054d96e207d" (UID: "ec6c6686-44c7-49ec-950b-7054d96e207d"). InnerVolumeSpecName "kube-api-access-fwzn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.495744 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec6c6686-44c7-49ec-950b-7054d96e207d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ec6c6686-44c7-49ec-950b-7054d96e207d" (UID: "ec6c6686-44c7-49ec-950b-7054d96e207d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.497731 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58f95cdfb-8kctw"] Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.500164 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ec6c6686-44c7-49ec-950b-7054d96e207d" (UID: "ec6c6686-44c7-49ec-950b-7054d96e207d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.501834 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-scripts" (OuterVolumeSpecName: "scripts") pod "ec6c6686-44c7-49ec-950b-7054d96e207d" (UID: "ec6c6686-44c7-49ec-950b-7054d96e207d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.521493 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec6c6686-44c7-49ec-950b-7054d96e207d" (UID: "ec6c6686-44c7-49ec-950b-7054d96e207d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.566896 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-config-data" (OuterVolumeSpecName: "config-data") pod "ec6c6686-44c7-49ec-950b-7054d96e207d" (UID: "ec6c6686-44c7-49ec-950b-7054d96e207d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.596656 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwzn7\" (UniqueName: \"kubernetes.io/projected/ec6c6686-44c7-49ec-950b-7054d96e207d-kube-api-access-fwzn7\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.596694 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.596705 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.596715 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.596723 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec6c6686-44c7-49ec-950b-7054d96e207d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.596731 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6c6686-44c7-49ec-950b-7054d96e207d-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.993250 4717 generic.go:334] "Generic (PLEG): container finished" podID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerID="20143731b1089e4e5df5f39c5474d218cc338bc7f42a3ac3401f869d4dfd0375" exitCode=0 Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.993323 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fba18a-af8c-449a-be74-e2ad6438afa0","Type":"ContainerDied","Data":"20143731b1089e4e5df5f39c5474d218cc338bc7f42a3ac3401f869d4dfd0375"} Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.994925 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" event={"ID":"47805c28-c90d-4882-a0ed-5e531fb545b4","Type":"ContainerStarted","Data":"5cdf200e16241d927d437a0cbe4619df8ca1d4dbb9f639b0637b551275855f26"} Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.996476 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-66q5h" event={"ID":"ec6c6686-44c7-49ec-950b-7054d96e207d","Type":"ContainerDied","Data":"6cee01344bf2ecd2ba2c16b4842bb76ff0530ad5e1c9ae19c84ebc6d9796f0cf"} Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.996495 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cee01344bf2ecd2ba2c16b4842bb76ff0530ad5e1c9ae19c84ebc6d9796f0cf" Mar 08 05:48:35 crc kubenswrapper[4717]: I0308 05:48:35.996546 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-66q5h" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:35.999148 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f95cdfb-8kctw" event={"ID":"205a9ad7-55ee-4a9a-8a92-ae9e15a53705","Type":"ContainerStarted","Data":"e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602"} Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.002779 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.002793 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f95cdfb-8kctw" event={"ID":"205a9ad7-55ee-4a9a-8a92-ae9e15a53705","Type":"ContainerStarted","Data":"6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a"} Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.002804 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.002814 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f95cdfb-8kctw" event={"ID":"205a9ad7-55ee-4a9a-8a92-ae9e15a53705","Type":"ContainerStarted","Data":"ccc4ca52f584495eb6a846fea4de4f67878aa7299a5ccb3479dbed60beb40ebf"} Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.018189 4717 generic.go:334] "Generic (PLEG): container finished" podID="f03fa5b6-b0dc-4284-8727-1205e37f854f" containerID="d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c" exitCode=0 Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.018250 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" event={"ID":"f03fa5b6-b0dc-4284-8727-1205e37f854f","Type":"ContainerDied","Data":"d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c"} Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.018274 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" event={"ID":"f03fa5b6-b0dc-4284-8727-1205e37f854f","Type":"ContainerStarted","Data":"26f6021acd934e3ce8b51ad9f121059a2bbd05137ac6384832e5854d79cfb68f"} Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.030291 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58f95cdfb-8kctw" podStartSLOduration=2.030270288 podStartE2EDuration="2.030270288s" podCreationTimestamp="2026-03-08 05:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:36.018761264 +0000 UTC m=+1342.936410108" watchObservedRunningTime="2026-03-08 05:48:36.030270288 +0000 UTC m=+1342.947919132" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.043463 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f775c4c7-brpx4" event={"ID":"d7bf9dbc-ec82-4659-92fe-509f95574ef3","Type":"ContainerStarted","Data":"ede82e8dab1eba27786acb1540009af7fe86ffed4687d3fb4cf27d878bdf0f30"} Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.055929 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.243418 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 05:48:36 crc kubenswrapper[4717]: E0308 05:48:36.244057 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6c6686-44c7-49ec-950b-7054d96e207d" containerName="cinder-db-sync" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.244071 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6c6686-44c7-49ec-950b-7054d96e207d" containerName="cinder-db-sync" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.244231 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ec6c6686-44c7-49ec-950b-7054d96e207d" containerName="cinder-db-sync" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.256317 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.265635 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qhn9b" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.265822 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.266431 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.266545 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.274784 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.308916 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddb99d68c-jzg85"] Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.311331 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.311521 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-scripts\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " 
pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.311659 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.311770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghlq6\" (UniqueName: \"kubernetes.io/projected/6fffef01-b5cb-4761-afed-d11e3e65ac1f-kube-api-access-ghlq6\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.311908 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.312067 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fffef01-b5cb-4761-afed-d11e3e65ac1f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.368321 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fdf58bb7c-v2hj5"] Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.377111 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.404389 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fdf58bb7c-v2hj5"] Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.424621 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.424658 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-scripts\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.425880 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.425917 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghlq6\" (UniqueName: \"kubernetes.io/projected/6fffef01-b5cb-4761-afed-d11e3e65ac1f-kube-api-access-ghlq6\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.425946 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.425981 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fffef01-b5cb-4761-afed-d11e3e65ac1f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.426090 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fffef01-b5cb-4761-afed-d11e3e65ac1f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.440140 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-scripts\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.442827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.443371 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.459865 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.498152 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghlq6\" (UniqueName: \"kubernetes.io/projected/6fffef01-b5cb-4761-afed-d11e3e65ac1f-kube-api-access-ghlq6\") pod \"cinder-scheduler-0\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.531854 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-svc\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.532155 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-swift-storage-0\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.532245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgfb\" (UniqueName: \"kubernetes.io/projected/41e72e63-490d-46aa-b4ff-68e33f7def1c-kube-api-access-vxgfb\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.532332 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-nb\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.532423 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-sb\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.532490 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-config\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.554845 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.592607 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.600070 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.610523 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.640993 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643312 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mcmr\" (UniqueName: \"kubernetes.io/projected/5996343f-e6ce-45a1-9a39-06c37968978e-kube-api-access-4mcmr\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643363 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data-custom\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643414 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-svc\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643473 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-swift-storage-0\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643500 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgfb\" (UniqueName: 
\"kubernetes.io/projected/41e72e63-490d-46aa-b4ff-68e33f7def1c-kube-api-access-vxgfb\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643522 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643548 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-nb\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643576 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5996343f-e6ce-45a1-9a39-06c37968978e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643598 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-scripts\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643649 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643674 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-config\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643749 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.643813 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5996343f-e6ce-45a1-9a39-06c37968978e-logs\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.644827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-svc\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.645222 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-swift-storage-0\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 
08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.645642 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-sb\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.649135 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-config\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.650039 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-nb\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.719587 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgfb\" (UniqueName: \"kubernetes.io/projected/41e72e63-490d-46aa-b4ff-68e33f7def1c-kube-api-access-vxgfb\") pod \"dnsmasq-dns-6fdf58bb7c-v2hj5\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.739814 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.751259 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-scripts\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.751295 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5996343f-e6ce-45a1-9a39-06c37968978e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.751369 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.751422 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5996343f-e6ce-45a1-9a39-06c37968978e-logs\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.751419 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5996343f-e6ce-45a1-9a39-06c37968978e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.751584 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mcmr\" (UniqueName: 
\"kubernetes.io/projected/5996343f-e6ce-45a1-9a39-06c37968978e-kube-api-access-4mcmr\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.751610 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data-custom\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.751736 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.755804 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5996343f-e6ce-45a1-9a39-06c37968978e-logs\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.760225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data-custom\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.760371 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-scripts\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.760549 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.761159 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.773917 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mcmr\" (UniqueName: \"kubernetes.io/projected/5996343f-e6ce-45a1-9a39-06c37968978e-kube-api-access-4mcmr\") pod \"cinder-api-0\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " pod="openstack/cinder-api-0" Mar 08 05:48:36 crc kubenswrapper[4717]: I0308 05:48:36.921876 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.076750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fba18a-af8c-449a-be74-e2ad6438afa0","Type":"ContainerDied","Data":"ac6eb310008bc6a20628caff5ecb63d45a26cf6dfa1f6415c1c705f250ce1691"} Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.076783 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac6eb310008bc6a20628caff5ecb63d45a26cf6dfa1f6415c1c705f250ce1691" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.129822 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.157400 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-scripts\") pod \"76fba18a-af8c-449a-be74-e2ad6438afa0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.157567 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-sg-core-conf-yaml\") pod \"76fba18a-af8c-449a-be74-e2ad6438afa0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.157616 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-log-httpd\") pod \"76fba18a-af8c-449a-be74-e2ad6438afa0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.157677 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-config-data\") pod \"76fba18a-af8c-449a-be74-e2ad6438afa0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.157711 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbszg\" (UniqueName: \"kubernetes.io/projected/76fba18a-af8c-449a-be74-e2ad6438afa0-kube-api-access-jbszg\") pod \"76fba18a-af8c-449a-be74-e2ad6438afa0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.157823 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-run-httpd\") pod \"76fba18a-af8c-449a-be74-e2ad6438afa0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.157853 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-combined-ca-bundle\") pod \"76fba18a-af8c-449a-be74-e2ad6438afa0\" (UID: \"76fba18a-af8c-449a-be74-e2ad6438afa0\") " Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.159558 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "76fba18a-af8c-449a-be74-e2ad6438afa0" (UID: "76fba18a-af8c-449a-be74-e2ad6438afa0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.159664 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "76fba18a-af8c-449a-be74-e2ad6438afa0" (UID: "76fba18a-af8c-449a-be74-e2ad6438afa0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.161855 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-scripts" (OuterVolumeSpecName: "scripts") pod "76fba18a-af8c-449a-be74-e2ad6438afa0" (UID: "76fba18a-af8c-449a-be74-e2ad6438afa0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.168929 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76fba18a-af8c-449a-be74-e2ad6438afa0-kube-api-access-jbszg" (OuterVolumeSpecName: "kube-api-access-jbszg") pod "76fba18a-af8c-449a-be74-e2ad6438afa0" (UID: "76fba18a-af8c-449a-be74-e2ad6438afa0"). InnerVolumeSpecName "kube-api-access-jbszg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.193141 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "76fba18a-af8c-449a-be74-e2ad6438afa0" (UID: "76fba18a-af8c-449a-be74-e2ad6438afa0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.243639 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76fba18a-af8c-449a-be74-e2ad6438afa0" (UID: "76fba18a-af8c-449a-be74-e2ad6438afa0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.259752 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.259776 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.259786 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbszg\" (UniqueName: \"kubernetes.io/projected/76fba18a-af8c-449a-be74-e2ad6438afa0-kube-api-access-jbszg\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.259795 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fba18a-af8c-449a-be74-e2ad6438afa0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.259803 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.259811 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.283977 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-config-data" (OuterVolumeSpecName: "config-data") pod "76fba18a-af8c-449a-be74-e2ad6438afa0" (UID: "76fba18a-af8c-449a-be74-e2ad6438afa0"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.361409 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76fba18a-af8c-449a-be74-e2ad6438afa0-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.967065 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 05:48:37 crc kubenswrapper[4717]: W0308 05:48:37.969751 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5996343f_e6ce_45a1_9a39_06c37968978e.slice/crio-6caee6889202a29f49f730d377850179a7f7e97bd7addc77684887fbeb9e0e5f WatchSource:0}: Error finding container 6caee6889202a29f49f730d377850179a7f7e97bd7addc77684887fbeb9e0e5f: Status 404 returned error can't find the container with id 6caee6889202a29f49f730d377850179a7f7e97bd7addc77684887fbeb9e0e5f Mar 08 05:48:37 crc kubenswrapper[4717]: I0308 05:48:37.986486 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.089580 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6fffef01-b5cb-4761-afed-d11e3e65ac1f","Type":"ContainerStarted","Data":"a5193aeac83ac94cba0be17699bcce6dd7e635178712d957237227c3e99a8ba0"} Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.091344 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5996343f-e6ce-45a1-9a39-06c37968978e","Type":"ContainerStarted","Data":"6caee6889202a29f49f730d377850179a7f7e97bd7addc77684887fbeb9e0e5f"} Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.106664 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" 
event={"ID":"47805c28-c90d-4882-a0ed-5e531fb545b4","Type":"ContainerStarted","Data":"4cbbe4d32a30500f558376e6f61cb2a5a51371e73147d1889ebe9e4c4070f3d6"} Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.106723 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" event={"ID":"47805c28-c90d-4882-a0ed-5e531fb545b4","Type":"ContainerStarted","Data":"8e0ff896f3e8ab872a030866a91466b7c48da6a9c14af0bb96d72da5024cc0bb"} Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.112809 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" event={"ID":"f03fa5b6-b0dc-4284-8727-1205e37f854f","Type":"ContainerStarted","Data":"38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57"} Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.112887 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" podUID="f03fa5b6-b0dc-4284-8727-1205e37f854f" containerName="dnsmasq-dns" containerID="cri-o://38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57" gracePeriod=10 Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.113031 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.122830 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.124130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f775c4c7-brpx4" event={"ID":"d7bf9dbc-ec82-4659-92fe-509f95574ef3","Type":"ContainerStarted","Data":"3786b205fc20b054828867391a6ebcd09efe83fc18a5f42a8d0aac0d80635485"} Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.124206 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f775c4c7-brpx4" event={"ID":"d7bf9dbc-ec82-4659-92fe-509f95574ef3","Type":"ContainerStarted","Data":"1e1ddc1427b4613ccef7680594dcf1b57ad490af720b8d316207511b3c9239f0"} Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.132487 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fdf58bb7c-v2hj5"] Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.161039 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-79d58bf7f8-bq7ms" podStartSLOduration=1.9115237440000001 podStartE2EDuration="4.161020634s" podCreationTimestamp="2026-03-08 05:48:34 +0000 UTC" firstStartedPulling="2026-03-08 05:48:35.261551649 +0000 UTC m=+1342.179200493" lastFinishedPulling="2026-03-08 05:48:37.511048539 +0000 UTC m=+1344.428697383" observedRunningTime="2026-03-08 05:48:38.150036873 +0000 UTC m=+1345.067685727" watchObservedRunningTime="2026-03-08 05:48:38.161020634 +0000 UTC m=+1345.078669478" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.179889 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" podStartSLOduration=4.179874448 podStartE2EDuration="4.179874448s" podCreationTimestamp="2026-03-08 05:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:38.179635582 +0000 UTC m=+1345.097284426" 
watchObservedRunningTime="2026-03-08 05:48:38.179874448 +0000 UTC m=+1345.097523292" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.209585 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-f775c4c7-brpx4" podStartSLOduration=1.941155203 podStartE2EDuration="4.209568809s" podCreationTimestamp="2026-03-08 05:48:34 +0000 UTC" firstStartedPulling="2026-03-08 05:48:35.265170718 +0000 UTC m=+1342.182819562" lastFinishedPulling="2026-03-08 05:48:37.533584324 +0000 UTC m=+1344.451233168" observedRunningTime="2026-03-08 05:48:38.205974081 +0000 UTC m=+1345.123622925" watchObservedRunningTime="2026-03-08 05:48:38.209568809 +0000 UTC m=+1345.127217653" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.341478 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.350856 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.371282 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:48:38 crc kubenswrapper[4717]: E0308 05:48:38.371731 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="ceilometer-notification-agent" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.371749 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="ceilometer-notification-agent" Mar 08 05:48:38 crc kubenswrapper[4717]: E0308 05:48:38.371767 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="proxy-httpd" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.371773 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="proxy-httpd" Mar 08 05:48:38 crc 
kubenswrapper[4717]: E0308 05:48:38.371799 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="ceilometer-central-agent" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.371808 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="ceilometer-central-agent" Mar 08 05:48:38 crc kubenswrapper[4717]: E0308 05:48:38.371821 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="sg-core" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.371827 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="sg-core" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.372012 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="ceilometer-central-agent" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.372029 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="proxy-httpd" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.372041 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="ceilometer-notification-agent" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.372050 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" containerName="sg-core" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.373769 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.379554 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.380435 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.380912 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.483676 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-log-httpd\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.484073 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-config-data\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.484116 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.484137 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " 
pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.484154 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-scripts\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.484172 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kc99\" (UniqueName: \"kubernetes.io/projected/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-kube-api-access-9kc99\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.484245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-run-httpd\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.585931 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.585969 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.585989 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-scripts\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.586010 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kc99\" (UniqueName: \"kubernetes.io/projected/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-kube-api-access-9kc99\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.586067 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-run-httpd\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.586126 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-log-httpd\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.586173 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-config-data\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.589319 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-run-httpd\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: 
I0308 05:48:38.589371 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-log-httpd\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.590564 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-config-data\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.591055 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-scripts\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.592871 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.593219 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.605670 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kc99\" (UniqueName: \"kubernetes.io/projected/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-kube-api-access-9kc99\") pod \"ceilometer-0\" (UID: 
\"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.701409 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.728285 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.919910 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-config\") pod \"f03fa5b6-b0dc-4284-8727-1205e37f854f\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.920348 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-swift-storage-0\") pod \"f03fa5b6-b0dc-4284-8727-1205e37f854f\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.920503 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-svc\") pod \"f03fa5b6-b0dc-4284-8727-1205e37f854f\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.920675 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-794zc\" (UniqueName: \"kubernetes.io/projected/f03fa5b6-b0dc-4284-8727-1205e37f854f-kube-api-access-794zc\") pod \"f03fa5b6-b0dc-4284-8727-1205e37f854f\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.920802 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-nb\") pod \"f03fa5b6-b0dc-4284-8727-1205e37f854f\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.920834 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-sb\") pod \"f03fa5b6-b0dc-4284-8727-1205e37f854f\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " Mar 08 05:48:38 crc kubenswrapper[4717]: I0308 05:48:38.929281 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f03fa5b6-b0dc-4284-8727-1205e37f854f-kube-api-access-794zc" (OuterVolumeSpecName: "kube-api-access-794zc") pod "f03fa5b6-b0dc-4284-8727-1205e37f854f" (UID: "f03fa5b6-b0dc-4284-8727-1205e37f854f"). InnerVolumeSpecName "kube-api-access-794zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:38.997356 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f03fa5b6-b0dc-4284-8727-1205e37f854f" (UID: "f03fa5b6-b0dc-4284-8727-1205e37f854f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.009509 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-config" (OuterVolumeSpecName: "config") pod "f03fa5b6-b0dc-4284-8727-1205e37f854f" (UID: "f03fa5b6-b0dc-4284-8727-1205e37f854f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.024363 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f03fa5b6-b0dc-4284-8727-1205e37f854f" (UID: "f03fa5b6-b0dc-4284-8727-1205e37f854f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.024562 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-sb\") pod \"f03fa5b6-b0dc-4284-8727-1205e37f854f\" (UID: \"f03fa5b6-b0dc-4284-8727-1205e37f854f\") " Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.024836 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-794zc\" (UniqueName: \"kubernetes.io/projected/f03fa5b6-b0dc-4284-8727-1205e37f854f-kube-api-access-794zc\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.024848 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.024856 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:39 crc kubenswrapper[4717]: W0308 05:48:39.024921 4717 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f03fa5b6-b0dc-4284-8727-1205e37f854f/volumes/kubernetes.io~configmap/ovsdbserver-sb Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.024931 4717 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f03fa5b6-b0dc-4284-8727-1205e37f854f" (UID: "f03fa5b6-b0dc-4284-8727-1205e37f854f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.036954 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f03fa5b6-b0dc-4284-8727-1205e37f854f" (UID: "f03fa5b6-b0dc-4284-8727-1205e37f854f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.107955 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f03fa5b6-b0dc-4284-8727-1205e37f854f" (UID: "f03fa5b6-b0dc-4284-8727-1205e37f854f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.129520 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.129551 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.129563 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f03fa5b6-b0dc-4284-8727-1205e37f854f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.154355 4717 generic.go:334] "Generic (PLEG): container finished" podID="f03fa5b6-b0dc-4284-8727-1205e37f854f" containerID="38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57" exitCode=0 Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.154412 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" event={"ID":"f03fa5b6-b0dc-4284-8727-1205e37f854f","Type":"ContainerDied","Data":"38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57"} Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.154445 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" event={"ID":"f03fa5b6-b0dc-4284-8727-1205e37f854f","Type":"ContainerDied","Data":"26f6021acd934e3ce8b51ad9f121059a2bbd05137ac6384832e5854d79cfb68f"} Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.154463 4717 scope.go:117] "RemoveContainer" containerID="38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.154567 4717 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddb99d68c-jzg85" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.172678 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5996343f-e6ce-45a1-9a39-06c37968978e","Type":"ContainerStarted","Data":"6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99"} Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.179892 4717 generic.go:334] "Generic (PLEG): container finished" podID="41e72e63-490d-46aa-b4ff-68e33f7def1c" containerID="dd0523f63c13e73265160658775fc47c36a80a21d806c65e934dd2569a3f8017" exitCode=0 Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.179974 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" event={"ID":"41e72e63-490d-46aa-b4ff-68e33f7def1c","Type":"ContainerDied","Data":"dd0523f63c13e73265160658775fc47c36a80a21d806c65e934dd2569a3f8017"} Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.181228 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" event={"ID":"41e72e63-490d-46aa-b4ff-68e33f7def1c","Type":"ContainerStarted","Data":"9d2db1dd4933d5a3d32598c00abe59f311debbcd4e7f2aca8d2a7349d70a403b"} Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.186603 4717 scope.go:117] "RemoveContainer" containerID="d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.206758 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.233889 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddb99d68c-jzg85"] Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.249373 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ddb99d68c-jzg85"] Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.324947 
4717 scope.go:117] "RemoveContainer" containerID="38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57" Mar 08 05:48:39 crc kubenswrapper[4717]: E0308 05:48:39.325761 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57\": container with ID starting with 38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57 not found: ID does not exist" containerID="38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.325804 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57"} err="failed to get container status \"38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57\": rpc error: code = NotFound desc = could not find container \"38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57\": container with ID starting with 38c954fc2f531e68b6b5bd09372e3187bab4bb11d8abb1aa81c88954d096ff57 not found: ID does not exist" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.325833 4717 scope.go:117] "RemoveContainer" containerID="d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c" Mar 08 05:48:39 crc kubenswrapper[4717]: E0308 05:48:39.326213 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c\": container with ID starting with d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c not found: ID does not exist" containerID="d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.326267 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c"} err="failed to get container status \"d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c\": rpc error: code = NotFound desc = could not find container \"d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c\": container with ID starting with d61c03f063267081cafa2963a9e48f8ff189ef3b8ddaf53dcf9474a51743e27c not found: ID does not exist" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.781738 4717 scope.go:117] "RemoveContainer" containerID="cd8dc25e4b6ba32a312e1e13dcb2b1aff222e3306582ef7c5a3989aedebc0684" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.797021 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76fba18a-af8c-449a-be74-e2ad6438afa0" path="/var/lib/kubelet/pods/76fba18a-af8c-449a-be74-e2ad6438afa0/volumes" Mar 08 05:48:39 crc kubenswrapper[4717]: I0308 05:48:39.797702 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f03fa5b6-b0dc-4284-8727-1205e37f854f" path="/var/lib/kubelet/pods/f03fa5b6-b0dc-4284-8727-1205e37f854f/volumes" Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.103047 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cf759c7cb-qxb65" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.170:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.170:8443: connect: connection refused" Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.197007 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a","Type":"ContainerStarted","Data":"11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006"} Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.198479 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"6fffef01-b5cb-4761-afed-d11e3e65ac1f","Type":"ContainerStarted","Data":"6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a"} Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.198500 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6fffef01-b5cb-4761-afed-d11e3e65ac1f","Type":"ContainerStarted","Data":"0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d"} Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.202815 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5996343f-e6ce-45a1-9a39-06c37968978e","Type":"ContainerStarted","Data":"5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108"} Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.203312 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.206201 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37","Type":"ContainerStarted","Data":"c4e826da2cc6d7ca8173344c8c10d03a82278d885982fc2c702c7e03e4a551ab"} Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.206234 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37","Type":"ContainerStarted","Data":"5580a3d494f5319249f010871d7ffb376fccd5e15eb310977c1c5e35ad842a69"} Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.206246 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37","Type":"ContainerStarted","Data":"db09693a7a76ac4d53b1b532eb993adb342e0de9056a048aeb782655b1274bf4"} Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.208343 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" 
event={"ID":"41e72e63-490d-46aa-b4ff-68e33f7def1c","Type":"ContainerStarted","Data":"6d17d523fafd41c0a08ce93a9f16c397a98b918f76bb985c91310a55468a5d3f"} Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.209060 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.251420 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" podStartSLOduration=4.251405026 podStartE2EDuration="4.251405026s" podCreationTimestamp="2026-03-08 05:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:40.245615663 +0000 UTC m=+1347.163264507" watchObservedRunningTime="2026-03-08 05:48:40.251405026 +0000 UTC m=+1347.169053870" Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.281387 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.8150367320000003 podStartE2EDuration="4.281367534s" podCreationTimestamp="2026-03-08 05:48:36 +0000 UTC" firstStartedPulling="2026-03-08 05:48:37.981060523 +0000 UTC m=+1344.898709367" lastFinishedPulling="2026-03-08 05:48:38.447391325 +0000 UTC m=+1345.365040169" observedRunningTime="2026-03-08 05:48:40.264600681 +0000 UTC m=+1347.182249525" watchObservedRunningTime="2026-03-08 05:48:40.281367534 +0000 UTC m=+1347.199016378" Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.284151 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.284141612 podStartE2EDuration="4.284141612s" podCreationTimestamp="2026-03-08 05:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:40.27879534 +0000 UTC m=+1347.196444184" 
watchObservedRunningTime="2026-03-08 05:48:40.284141612 +0000 UTC m=+1347.201790456" Mar 08 05:48:40 crc kubenswrapper[4717]: I0308 05:48:40.558321 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.171880 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-569c95cff8-l9lj5"] Mar 08 05:48:41 crc kubenswrapper[4717]: E0308 05:48:41.172517 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03fa5b6-b0dc-4284-8727-1205e37f854f" containerName="dnsmasq-dns" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.172529 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03fa5b6-b0dc-4284-8727-1205e37f854f" containerName="dnsmasq-dns" Mar 08 05:48:41 crc kubenswrapper[4717]: E0308 05:48:41.172549 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03fa5b6-b0dc-4284-8727-1205e37f854f" containerName="init" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.172554 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03fa5b6-b0dc-4284-8727-1205e37f854f" containerName="init" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.172720 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f03fa5b6-b0dc-4284-8727-1205e37f854f" containerName="dnsmasq-dns" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.199670 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.205883 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.206162 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.219458 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-569c95cff8-l9lj5"] Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.275999 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37","Type":"ContainerStarted","Data":"d55911cc78bf6c0b790d72585f2d54d9b9d7471a9d4549f5b41bcacd858b0470"} Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.371849 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-internal-tls-certs\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.372124 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f496e4-7d01-447c-9ebb-9da7b333d817-logs\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.372164 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-public-tls-certs\") pod \"barbican-api-569c95cff8-l9lj5\" 
(UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.372186 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-config-data-custom\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.372235 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttd84\" (UniqueName: \"kubernetes.io/projected/24f496e4-7d01-447c-9ebb-9da7b333d817-kube-api-access-ttd84\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.372279 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-config-data\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.372320 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-combined-ca-bundle\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.473959 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttd84\" (UniqueName: 
\"kubernetes.io/projected/24f496e4-7d01-447c-9ebb-9da7b333d817-kube-api-access-ttd84\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.474041 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-config-data\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.474086 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-combined-ca-bundle\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.474115 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-internal-tls-certs\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.474187 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f496e4-7d01-447c-9ebb-9da7b333d817-logs\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.474255 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-public-tls-certs\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.474275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-config-data-custom\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.474757 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f496e4-7d01-447c-9ebb-9da7b333d817-logs\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.481428 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-combined-ca-bundle\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.481930 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-config-data-custom\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.481953 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-public-tls-certs\") 
pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.482879 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-config-data\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.485993 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24f496e4-7d01-447c-9ebb-9da7b333d817-internal-tls-certs\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.492513 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttd84\" (UniqueName: \"kubernetes.io/projected/24f496e4-7d01-447c-9ebb-9da7b333d817-kube-api-access-ttd84\") pod \"barbican-api-569c95cff8-l9lj5\" (UID: \"24f496e4-7d01-447c-9ebb-9da7b333d817\") " pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.554028 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:41 crc kubenswrapper[4717]: I0308 05:48:41.611784 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 08 05:48:42 crc kubenswrapper[4717]: I0308 05:48:42.056651 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-569c95cff8-l9lj5"] Mar 08 05:48:42 crc kubenswrapper[4717]: W0308 05:48:42.074788 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24f496e4_7d01_447c_9ebb_9da7b333d817.slice/crio-3ae4f3587dfc7f1b57bf6af211c88132704ab5cbf16f179629bab01b82df90ee WatchSource:0}: Error finding container 3ae4f3587dfc7f1b57bf6af211c88132704ab5cbf16f179629bab01b82df90ee: Status 404 returned error can't find the container with id 3ae4f3587dfc7f1b57bf6af211c88132704ab5cbf16f179629bab01b82df90ee Mar 08 05:48:42 crc kubenswrapper[4717]: I0308 05:48:42.288069 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-569c95cff8-l9lj5" event={"ID":"24f496e4-7d01-447c-9ebb-9da7b333d817","Type":"ContainerStarted","Data":"edf93d408893489fa32d1367b7824b89a1671e656e581109a9f501baf93de7d4"} Mar 08 05:48:42 crc kubenswrapper[4717]: I0308 05:48:42.289042 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-569c95cff8-l9lj5" event={"ID":"24f496e4-7d01-447c-9ebb-9da7b333d817","Type":"ContainerStarted","Data":"3ae4f3587dfc7f1b57bf6af211c88132704ab5cbf16f179629bab01b82df90ee"} Mar 08 05:48:42 crc kubenswrapper[4717]: I0308 05:48:42.289008 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5996343f-e6ce-45a1-9a39-06c37968978e" containerName="cinder-api-log" containerID="cri-o://6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99" gracePeriod=30 Mar 08 05:48:42 crc kubenswrapper[4717]: I0308 05:48:42.289532 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5996343f-e6ce-45a1-9a39-06c37968978e" containerName="cinder-api" containerID="cri-o://5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108" gracePeriod=30 Mar 08 05:48:42 crc kubenswrapper[4717]: I0308 05:48:42.917046 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.005302 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5996343f-e6ce-45a1-9a39-06c37968978e-logs\") pod \"5996343f-e6ce-45a1-9a39-06c37968978e\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.005424 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-combined-ca-bundle\") pod \"5996343f-e6ce-45a1-9a39-06c37968978e\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.005485 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-scripts\") pod \"5996343f-e6ce-45a1-9a39-06c37968978e\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.005610 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data-custom\") pod \"5996343f-e6ce-45a1-9a39-06c37968978e\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.005637 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5996343f-e6ce-45a1-9a39-06c37968978e-etc-machine-id\") pod \"5996343f-e6ce-45a1-9a39-06c37968978e\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.005662 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mcmr\" (UniqueName: \"kubernetes.io/projected/5996343f-e6ce-45a1-9a39-06c37968978e-kube-api-access-4mcmr\") pod \"5996343f-e6ce-45a1-9a39-06c37968978e\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.005701 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data\") pod \"5996343f-e6ce-45a1-9a39-06c37968978e\" (UID: \"5996343f-e6ce-45a1-9a39-06c37968978e\") " Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.006722 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5996343f-e6ce-45a1-9a39-06c37968978e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5996343f-e6ce-45a1-9a39-06c37968978e" (UID: "5996343f-e6ce-45a1-9a39-06c37968978e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.007157 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5996343f-e6ce-45a1-9a39-06c37968978e-logs" (OuterVolumeSpecName: "logs") pod "5996343f-e6ce-45a1-9a39-06c37968978e" (UID: "5996343f-e6ce-45a1-9a39-06c37968978e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.011426 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-scripts" (OuterVolumeSpecName: "scripts") pod "5996343f-e6ce-45a1-9a39-06c37968978e" (UID: "5996343f-e6ce-45a1-9a39-06c37968978e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.042550 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5996343f-e6ce-45a1-9a39-06c37968978e-kube-api-access-4mcmr" (OuterVolumeSpecName: "kube-api-access-4mcmr") pod "5996343f-e6ce-45a1-9a39-06c37968978e" (UID: "5996343f-e6ce-45a1-9a39-06c37968978e"). InnerVolumeSpecName "kube-api-access-4mcmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.043854 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5996343f-e6ce-45a1-9a39-06c37968978e" (UID: "5996343f-e6ce-45a1-9a39-06c37968978e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.061426 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5996343f-e6ce-45a1-9a39-06c37968978e" (UID: "5996343f-e6ce-45a1-9a39-06c37968978e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.092815 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data" (OuterVolumeSpecName: "config-data") pod "5996343f-e6ce-45a1-9a39-06c37968978e" (UID: "5996343f-e6ce-45a1-9a39-06c37968978e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.107839 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.107873 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5996343f-e6ce-45a1-9a39-06c37968978e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.107883 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mcmr\" (UniqueName: \"kubernetes.io/projected/5996343f-e6ce-45a1-9a39-06c37968978e-kube-api-access-4mcmr\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.107894 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.107903 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5996343f-e6ce-45a1-9a39-06c37968978e-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.107911 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.107920 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5996343f-e6ce-45a1-9a39-06c37968978e-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.303771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-569c95cff8-l9lj5" event={"ID":"24f496e4-7d01-447c-9ebb-9da7b333d817","Type":"ContainerStarted","Data":"9bffe15a8a1b9addb7e94c633529d6a8bee6add94824e883792d82fb6c3ef433"} Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.304124 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.308296 4717 generic.go:334] "Generic (PLEG): container finished" podID="5996343f-e6ce-45a1-9a39-06c37968978e" containerID="5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108" exitCode=0 Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.308325 4717 generic.go:334] "Generic (PLEG): container finished" podID="5996343f-e6ce-45a1-9a39-06c37968978e" containerID="6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99" exitCode=143 Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.308364 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5996343f-e6ce-45a1-9a39-06c37968978e","Type":"ContainerDied","Data":"5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108"} Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.308388 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5996343f-e6ce-45a1-9a39-06c37968978e","Type":"ContainerDied","Data":"6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99"} Mar 08 05:48:43 crc 
kubenswrapper[4717]: I0308 05:48:43.308399 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5996343f-e6ce-45a1-9a39-06c37968978e","Type":"ContainerDied","Data":"6caee6889202a29f49f730d377850179a7f7e97bd7addc77684887fbeb9e0e5f"} Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.308412 4717 scope.go:117] "RemoveContainer" containerID="5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.308509 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.321041 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37","Type":"ContainerStarted","Data":"285a10d6d26d2c51bd96b1ce9eb81145ab74d2c4266a5c555ccb56be2c673922"} Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.321963 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.337792 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-569c95cff8-l9lj5" podStartSLOduration=2.3377733640000002 podStartE2EDuration="2.337773364s" podCreationTimestamp="2026-03-08 05:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:43.322330174 +0000 UTC m=+1350.239979018" watchObservedRunningTime="2026-03-08 05:48:43.337773364 +0000 UTC m=+1350.255422198" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.357065 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.266376766 podStartE2EDuration="5.357044339s" podCreationTimestamp="2026-03-08 05:48:38 +0000 UTC" firstStartedPulling="2026-03-08 
05:48:39.274783629 +0000 UTC m=+1346.192432473" lastFinishedPulling="2026-03-08 05:48:42.365451202 +0000 UTC m=+1349.283100046" observedRunningTime="2026-03-08 05:48:43.344351506 +0000 UTC m=+1350.262000360" watchObservedRunningTime="2026-03-08 05:48:43.357044339 +0000 UTC m=+1350.274693193" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.365278 4717 scope.go:117] "RemoveContainer" containerID="6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.374409 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.388781 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.394163 4717 scope.go:117] "RemoveContainer" containerID="5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.394261 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 08 05:48:43 crc kubenswrapper[4717]: E0308 05:48:43.394603 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5996343f-e6ce-45a1-9a39-06c37968978e" containerName="cinder-api" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.394618 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5996343f-e6ce-45a1-9a39-06c37968978e" containerName="cinder-api" Mar 08 05:48:43 crc kubenswrapper[4717]: E0308 05:48:43.394642 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5996343f-e6ce-45a1-9a39-06c37968978e" containerName="cinder-api-log" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.394649 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5996343f-e6ce-45a1-9a39-06c37968978e" containerName="cinder-api-log" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.394828 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5996343f-e6ce-45a1-9a39-06c37968978e" containerName="cinder-api-log" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.394839 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5996343f-e6ce-45a1-9a39-06c37968978e" containerName="cinder-api" Mar 08 05:48:43 crc kubenswrapper[4717]: E0308 05:48:43.394948 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108\": container with ID starting with 5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108 not found: ID does not exist" containerID="5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.395003 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108"} err="failed to get container status \"5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108\": rpc error: code = NotFound desc = could not find container \"5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108\": container with ID starting with 5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108 not found: ID does not exist" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.395043 4717 scope.go:117] "RemoveContainer" containerID="6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.395769 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: E0308 05:48:43.395864 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99\": container with ID starting with 6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99 not found: ID does not exist" containerID="6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.395909 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99"} err="failed to get container status \"6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99\": rpc error: code = NotFound desc = could not find container \"6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99\": container with ID starting with 6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99 not found: ID does not exist" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.395933 4717 scope.go:117] "RemoveContainer" containerID="5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.397248 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108"} err="failed to get container status \"5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108\": rpc error: code = NotFound desc = could not find container \"5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108\": container with ID starting with 5605807b97d0bf58952075bf9f3f104268bfdc1d18eef8d23c79a3dee79e2108 not found: ID does not exist" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.397275 4717 scope.go:117] "RemoveContainer" 
containerID="6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.397609 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99"} err="failed to get container status \"6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99\": rpc error: code = NotFound desc = could not find container \"6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99\": container with ID starting with 6ae72b171ec5e862cb9e278db1bb5de0e066c5ec0804041295421db5efd9fe99 not found: ID does not exist" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.402123 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.402323 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.403207 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.415066 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.514455 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e36763c-a3c1-424c-8982-1af635ee7100-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.514500 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgvk4\" (UniqueName: \"kubernetes.io/projected/5e36763c-a3c1-424c-8982-1af635ee7100-kube-api-access-rgvk4\") pod 
\"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.514518 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-scripts\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.514543 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-config-data-custom\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.515547 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-config-data\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.515611 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.515690 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc 
kubenswrapper[4717]: I0308 05:48:43.516048 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.516081 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e36763c-a3c1-424c-8982-1af635ee7100-logs\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.618054 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.618097 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e36763c-a3c1-424c-8982-1af635ee7100-logs\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.618139 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e36763c-a3c1-424c-8982-1af635ee7100-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.618159 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-scripts\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.618176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgvk4\" (UniqueName: \"kubernetes.io/projected/5e36763c-a3c1-424c-8982-1af635ee7100-kube-api-access-rgvk4\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.618314 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-config-data-custom\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.618443 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-config-data\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.618476 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e36763c-a3c1-424c-8982-1af635ee7100-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.618494 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: 
I0308 05:48:43.618557 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.618791 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e36763c-a3c1-424c-8982-1af635ee7100-logs\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.623250 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.630365 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.630564 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.630589 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-scripts\") pod \"cinder-api-0\" (UID: 
\"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.630762 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-config-data-custom\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.631269 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e36763c-a3c1-424c-8982-1af635ee7100-config-data\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.632919 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgvk4\" (UniqueName: \"kubernetes.io/projected/5e36763c-a3c1-424c-8982-1af635ee7100-kube-api-access-rgvk4\") pod \"cinder-api-0\" (UID: \"5e36763c-a3c1-424c-8982-1af635ee7100\") " pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.725873 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 05:48:43 crc kubenswrapper[4717]: I0308 05:48:43.795367 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5996343f-e6ce-45a1-9a39-06c37968978e" path="/var/lib/kubelet/pods/5996343f-e6ce-45a1-9a39-06c37968978e/volumes" Mar 08 05:48:44 crc kubenswrapper[4717]: I0308 05:48:44.189101 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 05:48:44 crc kubenswrapper[4717]: W0308 05:48:44.191510 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e36763c_a3c1_424c_8982_1af635ee7100.slice/crio-6667f99acc9e3753e4c83101abb5cf3d00f895e6608d41d1956f4bd67b4c4419 WatchSource:0}: Error finding container 6667f99acc9e3753e4c83101abb5cf3d00f895e6608d41d1956f4bd67b4c4419: Status 404 returned error can't find the container with id 6667f99acc9e3753e4c83101abb5cf3d00f895e6608d41d1956f4bd67b4c4419 Mar 08 05:48:44 crc kubenswrapper[4717]: I0308 05:48:44.334799 4717 generic.go:334] "Generic (PLEG): container finished" podID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerID="11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006" exitCode=1 Mar 08 05:48:44 crc kubenswrapper[4717]: I0308 05:48:44.334858 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a","Type":"ContainerDied","Data":"11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006"} Mar 08 05:48:44 crc kubenswrapper[4717]: I0308 05:48:44.334890 4717 scope.go:117] "RemoveContainer" containerID="cd8dc25e4b6ba32a312e1e13dcb2b1aff222e3306582ef7c5a3989aedebc0684" Mar 08 05:48:44 crc kubenswrapper[4717]: I0308 05:48:44.335501 4717 scope.go:117] "RemoveContainer" containerID="11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006" Mar 08 05:48:44 crc kubenswrapper[4717]: E0308 05:48:44.335750 4717 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(f29ab6b7-97ec-4d9d-ba67-d4abad06de9a)\"" pod="openstack/watcher-decision-engine-0" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" Mar 08 05:48:44 crc kubenswrapper[4717]: I0308 05:48:44.348053 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5e36763c-a3c1-424c-8982-1af635ee7100","Type":"ContainerStarted","Data":"6667f99acc9e3753e4c83101abb5cf3d00f895e6608d41d1956f4bd67b4c4419"} Mar 08 05:48:44 crc kubenswrapper[4717]: I0308 05:48:44.348417 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.033088 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.286032 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78c779987c-vpp7g"] Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.286477 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78c779987c-vpp7g" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerName="neutron-api" containerID="cri-o://deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce" gracePeriod=30 Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.287135 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78c779987c-vpp7g" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerName="neutron-httpd" containerID="cri-o://d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9" gracePeriod=30 Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.307597 4717 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/neutron-78c779987c-vpp7g" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.179:9696/\": EOF" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.344766 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-665f758875-jsp86"] Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.346325 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.358308 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-665f758875-jsp86"] Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.359892 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-internal-tls-certs\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.359928 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-public-tls-certs\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.359976 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-httpd-config\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.360006 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-ovndb-tls-certs\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.360026 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-config\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.360047 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-combined-ca-bundle\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.360119 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgbmc\" (UniqueName: \"kubernetes.io/projected/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-kube-api-access-dgbmc\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.374018 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5e36763c-a3c1-424c-8982-1af635ee7100","Type":"ContainerStarted","Data":"0597e0cd144fe86702b42295a7220a1e13b21332158006d0e89954617bce5124"} Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.462539 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-ovndb-tls-certs\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.462589 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-config\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.462627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-combined-ca-bundle\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.462751 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgbmc\" (UniqueName: \"kubernetes.io/projected/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-kube-api-access-dgbmc\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.462847 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-internal-tls-certs\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.462868 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-public-tls-certs\") pod 
\"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.462954 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-httpd-config\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.467374 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-config\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.468613 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-combined-ca-bundle\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.470321 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-httpd-config\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.471083 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-ovndb-tls-certs\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc 
kubenswrapper[4717]: I0308 05:48:45.473471 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-public-tls-certs\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.473508 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-internal-tls-certs\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.485337 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgbmc\" (UniqueName: \"kubernetes.io/projected/7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494-kube-api-access-dgbmc\") pod \"neutron-665f758875-jsp86\" (UID: \"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494\") " pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:45 crc kubenswrapper[4717]: I0308 05:48:45.676559 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.296396 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-665f758875-jsp86"] Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.408084 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5e36763c-a3c1-424c-8982-1af635ee7100","Type":"ContainerStarted","Data":"1817d0e27be7c04774d62291673bce64c5027aeab1f5576c5e43a1034128626f"} Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.408404 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.409550 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-665f758875-jsp86" event={"ID":"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494","Type":"ContainerStarted","Data":"47d15ef18b1acfcbd0251c582b8d1280362911c56e25d5a2e29ac5a5a2cc4fb5"} Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.416405 4717 generic.go:334] "Generic (PLEG): container finished" podID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerID="d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9" exitCode=0 Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.416433 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c779987c-vpp7g" event={"ID":"3b21c262-66aa-47df-ad60-24b7a43031a3","Type":"ContainerDied","Data":"d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9"} Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.456559 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.456537699 podStartE2EDuration="3.456537699s" podCreationTimestamp="2026-03-08 05:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 
05:48:46.435420149 +0000 UTC m=+1353.353068993" watchObservedRunningTime="2026-03-08 05:48:46.456537699 +0000 UTC m=+1353.374186543" Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.634629 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.741429 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.802056 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56bd65676f-5n794"] Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.802290 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56bd65676f-5n794" podUID="f9b525ae-0a3d-41ba-b961-2e1fecce18b9" containerName="dnsmasq-dns" containerID="cri-o://b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788" gracePeriod=10 Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.834137 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.845573 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 08 05:48:46 crc kubenswrapper[4717]: I0308 05:48:46.915425 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.343175 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.408122 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fskv\" (UniqueName: \"kubernetes.io/projected/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-kube-api-access-6fskv\") pod \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.408252 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-config\") pod \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.408379 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-sb\") pod \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.408413 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-swift-storage-0\") pod \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.408449 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-svc\") pod \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.408502 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-nb\") pod \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\" (UID: \"f9b525ae-0a3d-41ba-b961-2e1fecce18b9\") " Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.423655 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-kube-api-access-6fskv" (OuterVolumeSpecName: "kube-api-access-6fskv") pod "f9b525ae-0a3d-41ba-b961-2e1fecce18b9" (UID: "f9b525ae-0a3d-41ba-b961-2e1fecce18b9"). InnerVolumeSpecName "kube-api-access-6fskv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.446082 4717 generic.go:334] "Generic (PLEG): container finished" podID="f9b525ae-0a3d-41ba-b961-2e1fecce18b9" containerID="b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788" exitCode=0 Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.446135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bd65676f-5n794" event={"ID":"f9b525ae-0a3d-41ba-b961-2e1fecce18b9","Type":"ContainerDied","Data":"b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788"} Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.446175 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56bd65676f-5n794" event={"ID":"f9b525ae-0a3d-41ba-b961-2e1fecce18b9","Type":"ContainerDied","Data":"273f4198448dec3e01eec0537e568a2ca2ebcbd25270b5e41e2fa06407a40284"} Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.446193 4717 scope.go:117] "RemoveContainer" containerID="b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.446350 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56bd65676f-5n794" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.453694 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-665f758875-jsp86" event={"ID":"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494","Type":"ContainerStarted","Data":"a6926eeea8830c1f3fe91078bdae7717c4c50d88b965593374661b0e57c7ad53"} Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.453791 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-665f758875-jsp86" event={"ID":"7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494","Type":"ContainerStarted","Data":"34e47607e315070630c3cc114784d104951ee558e863a3e1a0da98e097fa2f9c"} Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.453891 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" containerName="cinder-scheduler" containerID="cri-o://0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d" gracePeriod=30 Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.455388 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" containerName="probe" containerID="cri-o://6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a" gracePeriod=30 Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.477891 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-665f758875-jsp86" podStartSLOduration=2.477872177 podStartE2EDuration="2.477872177s" podCreationTimestamp="2026-03-08 05:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:47.477710533 +0000 UTC m=+1354.395359367" watchObservedRunningTime="2026-03-08 05:48:47.477872177 +0000 UTC m=+1354.395521021" Mar 08 05:48:47 crc kubenswrapper[4717]: 
I0308 05:48:47.492924 4717 scope.go:117] "RemoveContainer" containerID="4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.507393 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9b525ae-0a3d-41ba-b961-2e1fecce18b9" (UID: "f9b525ae-0a3d-41ba-b961-2e1fecce18b9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.507419 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9b525ae-0a3d-41ba-b961-2e1fecce18b9" (UID: "f9b525ae-0a3d-41ba-b961-2e1fecce18b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.509030 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9b525ae-0a3d-41ba-b961-2e1fecce18b9" (UID: "f9b525ae-0a3d-41ba-b961-2e1fecce18b9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.510416 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.510441 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.510452 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.510461 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fskv\" (UniqueName: \"kubernetes.io/projected/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-kube-api-access-6fskv\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.529123 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9b525ae-0a3d-41ba-b961-2e1fecce18b9" (UID: "f9b525ae-0a3d-41ba-b961-2e1fecce18b9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.550268 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-config" (OuterVolumeSpecName: "config") pod "f9b525ae-0a3d-41ba-b961-2e1fecce18b9" (UID: "f9b525ae-0a3d-41ba-b961-2e1fecce18b9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.612144 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.612201 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b525ae-0a3d-41ba-b961-2e1fecce18b9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.649812 4717 scope.go:117] "RemoveContainer" containerID="b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788" Mar 08 05:48:47 crc kubenswrapper[4717]: E0308 05:48:47.650323 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788\": container with ID starting with b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788 not found: ID does not exist" containerID="b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.650380 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788"} err="failed to get container status \"b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788\": rpc error: code = NotFound desc = could not find container \"b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788\": container with ID starting with b0e27c5f6c6d8ce954ad913c1fcf77e9cccdd002cf79f11a6d8be766d4c22788 not found: ID does not exist" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.650409 4717 scope.go:117] "RemoveContainer" 
containerID="4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a" Mar 08 05:48:47 crc kubenswrapper[4717]: E0308 05:48:47.650822 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a\": container with ID starting with 4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a not found: ID does not exist" containerID="4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.650858 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a"} err="failed to get container status \"4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a\": rpc error: code = NotFound desc = could not find container \"4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a\": container with ID starting with 4d6cce48fb7f5d548f9e18ac60147a339ffcdd3ba4633f13f9fa429ad22bb64a not found: ID does not exist" Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.863652 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56bd65676f-5n794"] Mar 08 05:48:47 crc kubenswrapper[4717]: I0308 05:48:47.865841 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56bd65676f-5n794"] Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.277239 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.342366 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-ovndb-tls-certs\") pod \"3b21c262-66aa-47df-ad60-24b7a43031a3\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.342507 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-httpd-config\") pod \"3b21c262-66aa-47df-ad60-24b7a43031a3\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.342565 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-combined-ca-bundle\") pod \"3b21c262-66aa-47df-ad60-24b7a43031a3\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.342654 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5764\" (UniqueName: \"kubernetes.io/projected/3b21c262-66aa-47df-ad60-24b7a43031a3-kube-api-access-k5764\") pod \"3b21c262-66aa-47df-ad60-24b7a43031a3\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.342692 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-config\") pod \"3b21c262-66aa-47df-ad60-24b7a43031a3\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.342744 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-public-tls-certs\") pod \"3b21c262-66aa-47df-ad60-24b7a43031a3\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.342817 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-internal-tls-certs\") pod \"3b21c262-66aa-47df-ad60-24b7a43031a3\" (UID: \"3b21c262-66aa-47df-ad60-24b7a43031a3\") " Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.352165 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b21c262-66aa-47df-ad60-24b7a43031a3-kube-api-access-k5764" (OuterVolumeSpecName: "kube-api-access-k5764") pod "3b21c262-66aa-47df-ad60-24b7a43031a3" (UID: "3b21c262-66aa-47df-ad60-24b7a43031a3"). InnerVolumeSpecName "kube-api-access-k5764". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.376237 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3b21c262-66aa-47df-ad60-24b7a43031a3" (UID: "3b21c262-66aa-47df-ad60-24b7a43031a3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.415621 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-config" (OuterVolumeSpecName: "config") pod "3b21c262-66aa-47df-ad60-24b7a43031a3" (UID: "3b21c262-66aa-47df-ad60-24b7a43031a3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.436932 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3b21c262-66aa-47df-ad60-24b7a43031a3" (UID: "3b21c262-66aa-47df-ad60-24b7a43031a3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.443893 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.443917 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.443927 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5764\" (UniqueName: \"kubernetes.io/projected/3b21c262-66aa-47df-ad60-24b7a43031a3-kube-api-access-k5764\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.443937 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.454226 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3b21c262-66aa-47df-ad60-24b7a43031a3" (UID: "3b21c262-66aa-47df-ad60-24b7a43031a3"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.460753 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b21c262-66aa-47df-ad60-24b7a43031a3" (UID: "3b21c262-66aa-47df-ad60-24b7a43031a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.468780 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3b21c262-66aa-47df-ad60-24b7a43031a3" (UID: "3b21c262-66aa-47df-ad60-24b7a43031a3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.473562 4717 generic.go:334] "Generic (PLEG): container finished" podID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerID="deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce" exitCode=0 Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.473622 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78c779987c-vpp7g" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.473636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c779987c-vpp7g" event={"ID":"3b21c262-66aa-47df-ad60-24b7a43031a3","Type":"ContainerDied","Data":"deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce"} Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.473666 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78c779987c-vpp7g" event={"ID":"3b21c262-66aa-47df-ad60-24b7a43031a3","Type":"ContainerDied","Data":"0dd173f4f4c9a480972f322245910443c573d1c7eab06887b5cc282ed79c179d"} Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.473685 4717 scope.go:117] "RemoveContainer" containerID="d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.478347 4717 generic.go:334] "Generic (PLEG): container finished" podID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" containerID="6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a" exitCode=0 Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.478394 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6fffef01-b5cb-4761-afed-d11e3e65ac1f","Type":"ContainerDied","Data":"6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a"} Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.479575 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.480219 4717 scope.go:117] "RemoveContainer" containerID="11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006" Mar 08 05:48:48 crc kubenswrapper[4717]: E0308 05:48:48.480461 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s 
restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(f29ab6b7-97ec-4d9d-ba67-d4abad06de9a)\"" pod="openstack/watcher-decision-engine-0" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.481059 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.481085 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.481093 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.486833 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-665f758875-jsp86" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.490721 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.491612 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7755f67488-mclxw" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.545390 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.545417 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.545427 4717 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3b21c262-66aa-47df-ad60-24b7a43031a3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.570588 4717 scope.go:117] "RemoveContainer" containerID="deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.581519 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78c779987c-vpp7g"] Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.601062 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-78c779987c-vpp7g"] Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.617188 4717 scope.go:117] "RemoveContainer" containerID="d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9" Mar 08 05:48:48 crc kubenswrapper[4717]: E0308 05:48:48.618299 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9\": container with ID starting with d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9 not found: ID does not exist" containerID="d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.618325 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9"} err="failed to get container status \"d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9\": rpc error: code = NotFound desc = could not find container \"d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9\": container with ID starting with d31ab7fcfa17056b28b5d94e1a0378464a4135c18905c2699e64537c224e0fa9 not found: ID does not exist" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.618362 4717 scope.go:117] "RemoveContainer" 
containerID="deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce" Mar 08 05:48:48 crc kubenswrapper[4717]: E0308 05:48:48.618610 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce\": container with ID starting with deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce not found: ID does not exist" containerID="deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.618628 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce"} err="failed to get container status \"deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce\": rpc error: code = NotFound desc = could not find container \"deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce\": container with ID starting with deafd14689c146f932d28f3cb7efc8cc849c4894953c227a764f062493bdd7ce not found: ID does not exist" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.783145 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6c49bc6878-t8tg8"] Mar 08 05:48:48 crc kubenswrapper[4717]: E0308 05:48:48.783521 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerName="neutron-httpd" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.783533 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerName="neutron-httpd" Mar 08 05:48:48 crc kubenswrapper[4717]: E0308 05:48:48.783550 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b525ae-0a3d-41ba-b961-2e1fecce18b9" containerName="dnsmasq-dns" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.783556 4717 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f9b525ae-0a3d-41ba-b961-2e1fecce18b9" containerName="dnsmasq-dns" Mar 08 05:48:48 crc kubenswrapper[4717]: E0308 05:48:48.783584 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b525ae-0a3d-41ba-b961-2e1fecce18b9" containerName="init" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.783590 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b525ae-0a3d-41ba-b961-2e1fecce18b9" containerName="init" Mar 08 05:48:48 crc kubenswrapper[4717]: E0308 05:48:48.783596 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerName="neutron-api" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.783601 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerName="neutron-api" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.783793 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerName="neutron-api" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.783816 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerName="neutron-httpd" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.783838 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b525ae-0a3d-41ba-b961-2e1fecce18b9" containerName="dnsmasq-dns" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.785621 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.815107 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6c49bc6878-t8tg8"] Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.863212 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgcn\" (UniqueName: \"kubernetes.io/projected/bb4d403c-6eb6-401c-9b4b-734c6adf3828-kube-api-access-4fgcn\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.863520 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4d403c-6eb6-401c-9b4b-734c6adf3828-logs\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.863543 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-scripts\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.863613 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-config-data\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.863638 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-public-tls-certs\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.863664 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-combined-ca-bundle\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.863763 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-internal-tls-certs\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.950268 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.964979 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgcn\" (UniqueName: \"kubernetes.io/projected/bb4d403c-6eb6-401c-9b4b-734c6adf3828-kube-api-access-4fgcn\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.965036 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4d403c-6eb6-401c-9b4b-734c6adf3828-logs\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.965057 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-scripts\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.965124 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-config-data\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.965149 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-public-tls-certs\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: 
I0308 05:48:48.965176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-combined-ca-bundle\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.965242 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-internal-tls-certs\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.967001 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4d403c-6eb6-401c-9b4b-734c6adf3828-logs\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.970452 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-scripts\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.970571 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-internal-tls-certs\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.974181 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-config-data\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.975374 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-combined-ca-bundle\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:48 crc kubenswrapper[4717]: I0308 05:48:48.976064 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4d403c-6eb6-401c-9b4b-734c6adf3828-public-tls-certs\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.000318 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgcn\" (UniqueName: \"kubernetes.io/projected/bb4d403c-6eb6-401c-9b4b-734c6adf3828-kube-api-access-4fgcn\") pod \"placement-6c49bc6878-t8tg8\" (UID: \"bb4d403c-6eb6-401c-9b4b-734c6adf3828\") " pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.066075 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-combined-ca-bundle\") pod \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.066500 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data-custom\") pod 
\"6fffef01-b5cb-4761-afed-d11e3e65ac1f\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.066729 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-scripts\") pod \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.066859 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghlq6\" (UniqueName: \"kubernetes.io/projected/6fffef01-b5cb-4761-afed-d11e3e65ac1f-kube-api-access-ghlq6\") pod \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.066970 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data\") pod \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.067064 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fffef01-b5cb-4761-afed-d11e3e65ac1f-etc-machine-id\") pod \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\" (UID: \"6fffef01-b5cb-4761-afed-d11e3e65ac1f\") " Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.067636 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fffef01-b5cb-4761-afed-d11e3e65ac1f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6fffef01-b5cb-4761-afed-d11e3e65ac1f" (UID: "6fffef01-b5cb-4761-afed-d11e3e65ac1f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.070315 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-scripts" (OuterVolumeSpecName: "scripts") pod "6fffef01-b5cb-4761-afed-d11e3e65ac1f" (UID: "6fffef01-b5cb-4761-afed-d11e3e65ac1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.070624 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6fffef01-b5cb-4761-afed-d11e3e65ac1f" (UID: "6fffef01-b5cb-4761-afed-d11e3e65ac1f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.071715 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fffef01-b5cb-4761-afed-d11e3e65ac1f-kube-api-access-ghlq6" (OuterVolumeSpecName: "kube-api-access-ghlq6") pod "6fffef01-b5cb-4761-afed-d11e3e65ac1f" (UID: "6fffef01-b5cb-4761-afed-d11e3e65ac1f"). InnerVolumeSpecName "kube-api-access-ghlq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.114228 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fffef01-b5cb-4761-afed-d11e3e65ac1f" (UID: "6fffef01-b5cb-4761-afed-d11e3e65ac1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.134121 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.168804 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.169046 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.169123 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.169195 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghlq6\" (UniqueName: \"kubernetes.io/projected/6fffef01-b5cb-4761-afed-d11e3e65ac1f-kube-api-access-ghlq6\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.169258 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fffef01-b5cb-4761-afed-d11e3e65ac1f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.188839 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data" (OuterVolumeSpecName: "config-data") pod "6fffef01-b5cb-4761-afed-d11e3e65ac1f" (UID: "6fffef01-b5cb-4761-afed-d11e3e65ac1f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.271024 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fffef01-b5cb-4761-afed-d11e3e65ac1f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.498553 4717 generic.go:334] "Generic (PLEG): container finished" podID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" containerID="0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d" exitCode=0 Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.498877 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6fffef01-b5cb-4761-afed-d11e3e65ac1f","Type":"ContainerDied","Data":"0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d"} Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.498906 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6fffef01-b5cb-4761-afed-d11e3e65ac1f","Type":"ContainerDied","Data":"a5193aeac83ac94cba0be17699bcce6dd7e635178712d957237227c3e99a8ba0"} Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.498923 4717 scope.go:117] "RemoveContainer" containerID="6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.499012 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.506066 4717 scope.go:117] "RemoveContainer" containerID="11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006" Mar 08 05:48:49 crc kubenswrapper[4717]: E0308 05:48:49.506417 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(f29ab6b7-97ec-4d9d-ba67-d4abad06de9a)\"" pod="openstack/watcher-decision-engine-0" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.559611 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.567568 4717 scope.go:117] "RemoveContainer" containerID="0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.576525 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.588760 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 05:48:49 crc kubenswrapper[4717]: E0308 05:48:49.589195 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" containerName="probe" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.589213 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" containerName="probe" Mar 08 05:48:49 crc kubenswrapper[4717]: E0308 05:48:49.589249 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" containerName="cinder-scheduler" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.589261 4717 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" containerName="cinder-scheduler" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.589428 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" containerName="cinder-scheduler" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.589458 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" containerName="probe" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.590462 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.593943 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.597551 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.609972 4717 scope.go:117] "RemoveContainer" containerID="6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a" Mar 08 05:48:49 crc kubenswrapper[4717]: E0308 05:48:49.610486 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a\": container with ID starting with 6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a not found: ID does not exist" containerID="6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.610518 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a"} err="failed to get container status \"6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a\": rpc error: code = 
NotFound desc = could not find container \"6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a\": container with ID starting with 6901b9bd9e20892d65691ccb223cfe4da9ca370014e71f75c1f73cdcfc7e412a not found: ID does not exist" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.610540 4717 scope.go:117] "RemoveContainer" containerID="0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d" Mar 08 05:48:49 crc kubenswrapper[4717]: E0308 05:48:49.610878 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d\": container with ID starting with 0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d not found: ID does not exist" containerID="0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.610896 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d"} err="failed to get container status \"0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d\": rpc error: code = NotFound desc = could not find container \"0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d\": container with ID starting with 0938bea306ad69c2a1d25a9b1c606aec91b4d8723e347dac477c9d181727e42d not found: ID does not exist" Mar 08 05:48:49 crc kubenswrapper[4717]: W0308 05:48:49.694101 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb4d403c_6eb6_401c_9b4b_734c6adf3828.slice/crio-32caf1f481e92593386f3670284b97fa058bc57213ffb8ad93ed664b9e9c2bc2 WatchSource:0}: Error finding container 32caf1f481e92593386f3670284b97fa058bc57213ffb8ad93ed664b9e9c2bc2: Status 404 returned error can't find the container with id 
32caf1f481e92593386f3670284b97fa058bc57213ffb8ad93ed664b9e9c2bc2 Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.700221 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6c49bc6878-t8tg8"] Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.789218 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnpzh\" (UniqueName: \"kubernetes.io/projected/8fc8ad72-3a80-4520-8387-11aeb8bca94f-kube-api-access-fnpzh\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.789544 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-scripts\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.789641 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.789693 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.789751 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/8fc8ad72-3a80-4520-8387-11aeb8bca94f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.789774 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-config-data\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.801278 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" path="/var/lib/kubelet/pods/3b21c262-66aa-47df-ad60-24b7a43031a3/volumes" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.801869 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fffef01-b5cb-4761-afed-d11e3e65ac1f" path="/var/lib/kubelet/pods/6fffef01-b5cb-4761-afed-d11e3e65ac1f/volumes" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.802502 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b525ae-0a3d-41ba-b961-2e1fecce18b9" path="/var/lib/kubelet/pods/f9b525ae-0a3d-41ba-b961-2e1fecce18b9/volumes" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.890193 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.890232 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.890292 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fc8ad72-3a80-4520-8387-11aeb8bca94f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.890323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-config-data\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.890380 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnpzh\" (UniqueName: \"kubernetes.io/projected/8fc8ad72-3a80-4520-8387-11aeb8bca94f-kube-api-access-fnpzh\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.890412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-scripts\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.890424 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8fc8ad72-3a80-4520-8387-11aeb8bca94f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.893838 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.894430 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-config-data\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.894893 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.897274 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc8ad72-3a80-4520-8387-11aeb8bca94f-scripts\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:49 crc kubenswrapper[4717]: I0308 05:48:49.915342 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnpzh\" (UniqueName: \"kubernetes.io/projected/8fc8ad72-3a80-4520-8387-11aeb8bca94f-kube-api-access-fnpzh\") pod \"cinder-scheduler-0\" (UID: \"8fc8ad72-3a80-4520-8387-11aeb8bca94f\") " pod="openstack/cinder-scheduler-0" Mar 08 05:48:50 crc kubenswrapper[4717]: I0308 05:48:50.103628 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cf759c7cb-qxb65" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.170:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.170:8443: connect: connection refused" Mar 08 05:48:50 crc kubenswrapper[4717]: I0308 05:48:50.204958 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 05:48:50 crc kubenswrapper[4717]: I0308 05:48:50.517257 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c49bc6878-t8tg8" event={"ID":"bb4d403c-6eb6-401c-9b4b-734c6adf3828","Type":"ContainerStarted","Data":"9fbd285ce844d289994708d1af506751502009b82940da40474cda3caa298b8e"} Mar 08 05:48:50 crc kubenswrapper[4717]: I0308 05:48:50.517723 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:50 crc kubenswrapper[4717]: I0308 05:48:50.518323 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:48:50 crc kubenswrapper[4717]: I0308 05:48:50.518352 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c49bc6878-t8tg8" event={"ID":"bb4d403c-6eb6-401c-9b4b-734c6adf3828","Type":"ContainerStarted","Data":"393677f98e39c60c3c532ed82ce387a3123843ac10bc394a6f360c134de967cc"} Mar 08 05:48:50 crc kubenswrapper[4717]: I0308 05:48:50.518368 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c49bc6878-t8tg8" event={"ID":"bb4d403c-6eb6-401c-9b4b-734c6adf3828","Type":"ContainerStarted","Data":"32caf1f481e92593386f3670284b97fa058bc57213ffb8ad93ed664b9e9c2bc2"} Mar 08 05:48:50 crc kubenswrapper[4717]: I0308 05:48:50.518328 4717 scope.go:117] "RemoveContainer" containerID="11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006" Mar 08 05:48:50 crc kubenswrapper[4717]: E0308 05:48:50.518645 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s 
restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(f29ab6b7-97ec-4d9d-ba67-d4abad06de9a)\"" pod="openstack/watcher-decision-engine-0" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" Mar 08 05:48:50 crc kubenswrapper[4717]: I0308 05:48:50.550507 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6c49bc6878-t8tg8" podStartSLOduration=2.550486096 podStartE2EDuration="2.550486096s" podCreationTimestamp="2026-03-08 05:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:50.537048025 +0000 UTC m=+1357.454696869" watchObservedRunningTime="2026-03-08 05:48:50.550486096 +0000 UTC m=+1357.468134940" Mar 08 05:48:50 crc kubenswrapper[4717]: I0308 05:48:50.675927 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 05:48:51 crc kubenswrapper[4717]: I0308 05:48:51.533599 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8fc8ad72-3a80-4520-8387-11aeb8bca94f","Type":"ContainerStarted","Data":"20ec85ceffe5cf12a5c7722a37514bc403175d4e2be14e0e009df6c564368eb1"} Mar 08 05:48:51 crc kubenswrapper[4717]: I0308 05:48:51.534161 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8fc8ad72-3a80-4520-8387-11aeb8bca94f","Type":"ContainerStarted","Data":"ade5b0ba31feebf8e3e4a075bb03e940ec737777c30caff8f9cebbab9e555437"} Mar 08 05:48:52 crc kubenswrapper[4717]: I0308 05:48:52.546748 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8fc8ad72-3a80-4520-8387-11aeb8bca94f","Type":"ContainerStarted","Data":"569b537cb8b548ff948837478e2a1292674731f2942b1d988a8e2f35e685b060"} Mar 08 05:48:52 crc kubenswrapper[4717]: I0308 05:48:52.587389 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=3.587361811 podStartE2EDuration="3.587361811s" podCreationTimestamp="2026-03-08 05:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:48:52.576882313 +0000 UTC m=+1359.494531167" watchObservedRunningTime="2026-03-08 05:48:52.587361811 +0000 UTC m=+1359.505010685" Mar 08 05:48:52 crc kubenswrapper[4717]: I0308 05:48:52.827885 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:52 crc kubenswrapper[4717]: I0308 05:48:52.879081 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-569c95cff8-l9lj5" Mar 08 05:48:52 crc kubenswrapper[4717]: I0308 05:48:52.961199 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58f95cdfb-8kctw"] Mar 08 05:48:52 crc kubenswrapper[4717]: I0308 05:48:52.961454 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58f95cdfb-8kctw" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerName="barbican-api-log" containerID="cri-o://6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a" gracePeriod=30 Mar 08 05:48:52 crc kubenswrapper[4717]: I0308 05:48:52.961607 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58f95cdfb-8kctw" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerName="barbican-api" containerID="cri-o://e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602" gracePeriod=30 Mar 08 05:48:53 crc kubenswrapper[4717]: I0308 05:48:53.560702 4717 generic.go:334] "Generic (PLEG): container finished" podID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerID="6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a" exitCode=143 Mar 08 05:48:53 crc kubenswrapper[4717]: I0308 05:48:53.560711 
4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f95cdfb-8kctw" event={"ID":"205a9ad7-55ee-4a9a-8a92-ae9e15a53705","Type":"ContainerDied","Data":"6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a"} Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.102106 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.205121 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.220126 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-combined-ca-bundle\") pod \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.220388 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data\") pod \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.220432 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-logs\") pod \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.220585 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk8zb\" (UniqueName: \"kubernetes.io/projected/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-kube-api-access-fk8zb\") pod \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\" (UID: 
\"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.220736 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data-custom\") pod \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\" (UID: \"205a9ad7-55ee-4a9a-8a92-ae9e15a53705\") " Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.220898 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-logs" (OuterVolumeSpecName: "logs") pod "205a9ad7-55ee-4a9a-8a92-ae9e15a53705" (UID: "205a9ad7-55ee-4a9a-8a92-ae9e15a53705"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.221466 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.229803 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "205a9ad7-55ee-4a9a-8a92-ae9e15a53705" (UID: "205a9ad7-55ee-4a9a-8a92-ae9e15a53705"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.229878 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-kube-api-access-fk8zb" (OuterVolumeSpecName: "kube-api-access-fk8zb") pod "205a9ad7-55ee-4a9a-8a92-ae9e15a53705" (UID: "205a9ad7-55ee-4a9a-8a92-ae9e15a53705"). InnerVolumeSpecName "kube-api-access-fk8zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.262660 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "205a9ad7-55ee-4a9a-8a92-ae9e15a53705" (UID: "205a9ad7-55ee-4a9a-8a92-ae9e15a53705"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.278963 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data" (OuterVolumeSpecName: "config-data") pod "205a9ad7-55ee-4a9a-8a92-ae9e15a53705" (UID: "205a9ad7-55ee-4a9a-8a92-ae9e15a53705"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.323546 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.323583 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.323592 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.323600 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk8zb\" (UniqueName: \"kubernetes.io/projected/205a9ad7-55ee-4a9a-8a92-ae9e15a53705-kube-api-access-fk8zb\") on node 
\"crc\" DevicePath \"\"" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.585036 4717 generic.go:334] "Generic (PLEG): container finished" podID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerID="e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602" exitCode=0 Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.585074 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f95cdfb-8kctw" event={"ID":"205a9ad7-55ee-4a9a-8a92-ae9e15a53705","Type":"ContainerDied","Data":"e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602"} Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.585099 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58f95cdfb-8kctw" event={"ID":"205a9ad7-55ee-4a9a-8a92-ae9e15a53705","Type":"ContainerDied","Data":"ccc4ca52f584495eb6a846fea4de4f67878aa7299a5ccb3479dbed60beb40ebf"} Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.585114 4717 scope.go:117] "RemoveContainer" containerID="e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.585212 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58f95cdfb-8kctw" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.615643 4717 scope.go:117] "RemoveContainer" containerID="6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.623565 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58f95cdfb-8kctw"] Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.631261 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-58f95cdfb-8kctw"] Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.646474 4717 scope.go:117] "RemoveContainer" containerID="e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602" Mar 08 05:48:55 crc kubenswrapper[4717]: E0308 05:48:55.646922 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602\": container with ID starting with e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602 not found: ID does not exist" containerID="e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.646962 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602"} err="failed to get container status \"e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602\": rpc error: code = NotFound desc = could not find container \"e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602\": container with ID starting with e45494b9280f9394545b39b322a9dea07525e5c4cd50c97b7202944592b8b602 not found: ID does not exist" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.646990 4717 scope.go:117] "RemoveContainer" containerID="6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a" Mar 08 
05:48:55 crc kubenswrapper[4717]: E0308 05:48:55.647502 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a\": container with ID starting with 6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a not found: ID does not exist" containerID="6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.647534 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a"} err="failed to get container status \"6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a\": rpc error: code = NotFound desc = could not find container \"6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a\": container with ID starting with 6852ae8e7cb0847d77c73c3dd557c55983379ebe2fd8aca3ed13fd4a1058214a not found: ID does not exist" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.681253 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 08 05:48:55 crc kubenswrapper[4717]: I0308 05:48:55.795390 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" path="/var/lib/kubelet/pods/205a9ad7-55ee-4a9a-8a92-ae9e15a53705/volumes" Mar 08 05:48:56 crc kubenswrapper[4717]: I0308 05:48:56.871854 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5799fc9f64-fmph6" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.739424 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 05:48:59 crc kubenswrapper[4717]: E0308 05:48:59.764531 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerName="barbican-api" Mar 
08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.764642 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerName="barbican-api" Mar 08 05:48:59 crc kubenswrapper[4717]: E0308 05:48:59.764779 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerName="barbican-api-log" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.764891 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerName="barbican-api-log" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.765431 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerName="barbican-api-log" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.765544 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerName="barbican-api" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.766346 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.766586 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.771025 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.771342 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.771443 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-5phf9" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.814324 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config-secret\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.814563 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp4vr\" (UniqueName: \"kubernetes.io/projected/e1a6ab12-b393-4557-a08f-0fada594901a-kube-api-access-bp4vr\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.814816 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.814990 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.842189 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58f95cdfb-8kctw" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.185:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.842263 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58f95cdfb-8kctw" podUID="205a9ad7-55ee-4a9a-8a92-ae9e15a53705" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.185:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.915638 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp4vr\" (UniqueName: \"kubernetes.io/projected/e1a6ab12-b393-4557-a08f-0fada594901a-kube-api-access-bp4vr\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.915717 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.915772 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-combined-ca-bundle\") pod 
\"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.915835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config-secret\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.917053 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.924580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config-secret\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.925513 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:48:59 crc kubenswrapper[4717]: I0308 05:48:59.943580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp4vr\" (UniqueName: \"kubernetes.io/projected/e1a6ab12-b393-4557-a08f-0fada594901a-kube-api-access-bp4vr\") pod \"openstackclient\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 
05:49:00.103223 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cf759c7cb-qxb65" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.170:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.170:8443: connect: connection refused" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.103340 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.109661 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.115851 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.120854 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.177443 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.178723 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.191436 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.222312 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.222353 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29kpq\" (UniqueName: \"kubernetes.io/projected/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-kube-api-access-29kpq\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.222385 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-openstack-config-secret\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.222450 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-openstack-config\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: E0308 05:49:00.263920 4717 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 08 05:49:00 crc kubenswrapper[4717]: rpc error: code = Unknown desc = failed to create pod 
network sandbox k8s_openstackclient_openstack_e1a6ab12-b393-4557-a08f-0fada594901a_0(e7b2bb385ad236e19bf9b7f8882b9d0e08dede89f86fc185196370871c8ef08d): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e7b2bb385ad236e19bf9b7f8882b9d0e08dede89f86fc185196370871c8ef08d" Netns:"/var/run/netns/54698e59-2377-4cc0-a665-7e027781b269" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=e7b2bb385ad236e19bf9b7f8882b9d0e08dede89f86fc185196370871c8ef08d;K8S_POD_UID=e1a6ab12-b393-4557-a08f-0fada594901a" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e1a6ab12-b393-4557-a08f-0fada594901a]: expected pod UID "e1a6ab12-b393-4557-a08f-0fada594901a" but got "9a66e3e0-63d9-4ca4-ab60-8a842f37cc68" from Kube API Mar 08 05:49:00 crc kubenswrapper[4717]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 05:49:00 crc kubenswrapper[4717]: > Mar 08 05:49:00 crc kubenswrapper[4717]: E0308 05:49:00.263982 4717 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 08 05:49:00 crc kubenswrapper[4717]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e1a6ab12-b393-4557-a08f-0fada594901a_0(e7b2bb385ad236e19bf9b7f8882b9d0e08dede89f86fc185196370871c8ef08d): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd 
(shim): CNI request failed with status 400: 'ContainerID:"e7b2bb385ad236e19bf9b7f8882b9d0e08dede89f86fc185196370871c8ef08d" Netns:"/var/run/netns/54698e59-2377-4cc0-a665-7e027781b269" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=e7b2bb385ad236e19bf9b7f8882b9d0e08dede89f86fc185196370871c8ef08d;K8S_POD_UID=e1a6ab12-b393-4557-a08f-0fada594901a" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e1a6ab12-b393-4557-a08f-0fada594901a]: expected pod UID "e1a6ab12-b393-4557-a08f-0fada594901a" but got "9a66e3e0-63d9-4ca4-ab60-8a842f37cc68" from Kube API Mar 08 05:49:00 crc kubenswrapper[4717]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 05:49:00 crc kubenswrapper[4717]: > pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.330375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.330429 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29kpq\" (UniqueName: \"kubernetes.io/projected/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-kube-api-access-29kpq\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.330493 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-openstack-config-secret\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.330534 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-openstack-config\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.331435 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-openstack-config\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.349142 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.349159 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-openstack-config-secret\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.357365 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29kpq\" (UniqueName: 
\"kubernetes.io/projected/9a66e3e0-63d9-4ca4-ab60-8a842f37cc68-kube-api-access-29kpq\") pod \"openstackclient\" (UID: \"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68\") " pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.378551 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.650301 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.659245 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e1a6ab12-b393-4557-a08f-0fada594901a" podUID="9a66e3e0-63d9-4ca4-ab60-8a842f37cc68" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.659745 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.668862 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.736433 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config-secret\") pod \"e1a6ab12-b393-4557-a08f-0fada594901a\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.736597 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config\") pod \"e1a6ab12-b393-4557-a08f-0fada594901a\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.736666 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp4vr\" (UniqueName: \"kubernetes.io/projected/e1a6ab12-b393-4557-a08f-0fada594901a-kube-api-access-bp4vr\") pod \"e1a6ab12-b393-4557-a08f-0fada594901a\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.736755 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-combined-ca-bundle\") pod \"e1a6ab12-b393-4557-a08f-0fada594901a\" (UID: \"e1a6ab12-b393-4557-a08f-0fada594901a\") " Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.737137 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e1a6ab12-b393-4557-a08f-0fada594901a" (UID: "e1a6ab12-b393-4557-a08f-0fada594901a"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.738050 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.741069 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a6ab12-b393-4557-a08f-0fada594901a-kube-api-access-bp4vr" (OuterVolumeSpecName: "kube-api-access-bp4vr") pod "e1a6ab12-b393-4557-a08f-0fada594901a" (UID: "e1a6ab12-b393-4557-a08f-0fada594901a"). InnerVolumeSpecName "kube-api-access-bp4vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.741062 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1a6ab12-b393-4557-a08f-0fada594901a" (UID: "e1a6ab12-b393-4557-a08f-0fada594901a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.745880 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e1a6ab12-b393-4557-a08f-0fada594901a" (UID: "e1a6ab12-b393-4557-a08f-0fada594901a"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.840401 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp4vr\" (UniqueName: \"kubernetes.io/projected/e1a6ab12-b393-4557-a08f-0fada594901a-kube-api-access-bp4vr\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.840428 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:00 crc kubenswrapper[4717]: I0308 05:49:00.840437 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e1a6ab12-b393-4557-a08f-0fada594901a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:01 crc kubenswrapper[4717]: I0308 05:49:01.108281 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 05:49:01 crc kubenswrapper[4717]: W0308 05:49:01.118349 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a66e3e0_63d9_4ca4_ab60_8a842f37cc68.slice/crio-9b67c3d214057493612672780e3d7ec651d1a4f210be822c74237b8d64210328 WatchSource:0}: Error finding container 9b67c3d214057493612672780e3d7ec651d1a4f210be822c74237b8d64210328: Status 404 returned error can't find the container with id 9b67c3d214057493612672780e3d7ec651d1a4f210be822c74237b8d64210328 Mar 08 05:49:01 crc kubenswrapper[4717]: I0308 05:49:01.665423 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68","Type":"ContainerStarted","Data":"9b67c3d214057493612672780e3d7ec651d1a4f210be822c74237b8d64210328"} Mar 08 05:49:01 crc kubenswrapper[4717]: I0308 05:49:01.665455 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 05:49:01 crc kubenswrapper[4717]: I0308 05:49:01.668233 4717 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e1a6ab12-b393-4557-a08f-0fada594901a" podUID="9a66e3e0-63d9-4ca4-ab60-8a842f37cc68" Mar 08 05:49:01 crc kubenswrapper[4717]: I0308 05:49:01.793905 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a6ab12-b393-4557-a08f-0fada594901a" path="/var/lib/kubelet/pods/e1a6ab12-b393-4557-a08f-0fada594901a/volumes" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.619496 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-cc47695ff-btlzb"] Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.621432 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.623542 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.628570 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.628843 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.632021 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-cc47695ff-btlzb"] Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.690532 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-run-httpd\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 
05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.690668 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-log-httpd\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.690736 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-config-data\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.690922 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jr7t\" (UniqueName: \"kubernetes.io/projected/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-kube-api-access-6jr7t\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.691004 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-etc-swift\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.691077 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-internal-tls-certs\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " 
pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.691157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-public-tls-certs\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.691191 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-combined-ca-bundle\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.790176 4717 scope.go:117] "RemoveContainer" containerID="11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.792448 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-internal-tls-certs\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.792481 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-public-tls-certs\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.792501 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-combined-ca-bundle\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.792545 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-run-httpd\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.792585 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-log-httpd\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.792603 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-config-data\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.792663 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jr7t\" (UniqueName: \"kubernetes.io/projected/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-kube-api-access-6jr7t\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.792710 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-etc-swift\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.793733 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-log-httpd\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.794007 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-run-httpd\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.801023 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-etc-swift\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.807190 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-public-tls-certs\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.814848 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-internal-tls-certs\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: 
\"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.816507 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-combined-ca-bundle\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.825472 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jr7t\" (UniqueName: \"kubernetes.io/projected/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-kube-api-access-6jr7t\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.839839 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6-config-data\") pod \"swift-proxy-cc47695ff-btlzb\" (UID: \"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6\") " pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:03 crc kubenswrapper[4717]: I0308 05:49:03.951652 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.120525 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.120878 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.244508 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.245085 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="ceilometer-central-agent" containerID="cri-o://5580a3d494f5319249f010871d7ffb376fccd5e15eb310977c1c5e35ad842a69" gracePeriod=30 Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.245811 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="proxy-httpd" containerID="cri-o://285a10d6d26d2c51bd96b1ce9eb81145ab74d2c4266a5c555ccb56be2c673922" gracePeriod=30 Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.245984 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="sg-core" containerID="cri-o://d55911cc78bf6c0b790d72585f2d54d9b9d7471a9d4549f5b41bcacd858b0470" 
gracePeriod=30 Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.246119 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="ceilometer-notification-agent" containerID="cri-o://c4e826da2cc6d7ca8173344c8c10d03a82278d885982fc2c702c7e03e4a551ab" gracePeriod=30 Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.284640 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.302243 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68b99b9-3abd-4e46-b116-c740daf70c8f-logs\") pod \"a68b99b9-3abd-4e46-b116-c740daf70c8f\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.302862 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a68b99b9-3abd-4e46-b116-c740daf70c8f-logs" (OuterVolumeSpecName: "logs") pod "a68b99b9-3abd-4e46-b116-c740daf70c8f" (UID: "a68b99b9-3abd-4e46-b116-c740daf70c8f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.302943 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-scripts\") pod \"a68b99b9-3abd-4e46-b116-c740daf70c8f\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.303129 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-combined-ca-bundle\") pod \"a68b99b9-3abd-4e46-b116-c740daf70c8f\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.303838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k2jb\" (UniqueName: \"kubernetes.io/projected/a68b99b9-3abd-4e46-b116-c740daf70c8f-kube-api-access-7k2jb\") pod \"a68b99b9-3abd-4e46-b116-c740daf70c8f\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.303867 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-config-data\") pod \"a68b99b9-3abd-4e46-b116-c740daf70c8f\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.303905 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-tls-certs\") pod \"a68b99b9-3abd-4e46-b116-c740daf70c8f\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.303926 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-secret-key\") pod \"a68b99b9-3abd-4e46-b116-c740daf70c8f\" (UID: \"a68b99b9-3abd-4e46-b116-c740daf70c8f\") " Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.304319 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a68b99b9-3abd-4e46-b116-c740daf70c8f-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.309719 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68b99b9-3abd-4e46-b116-c740daf70c8f-kube-api-access-7k2jb" (OuterVolumeSpecName: "kube-api-access-7k2jb") pod "a68b99b9-3abd-4e46-b116-c740daf70c8f" (UID: "a68b99b9-3abd-4e46-b116-c740daf70c8f"). InnerVolumeSpecName "kube-api-access-7k2jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.313964 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a68b99b9-3abd-4e46-b116-c740daf70c8f" (UID: "a68b99b9-3abd-4e46-b116-c740daf70c8f"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.352165 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.189:3000/\": read tcp 10.217.0.2:39040->10.217.0.189:3000: read: connection reset by peer" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.354905 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a68b99b9-3abd-4e46-b116-c740daf70c8f" (UID: "a68b99b9-3abd-4e46-b116-c740daf70c8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.383632 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a68b99b9-3abd-4e46-b116-c740daf70c8f" (UID: "a68b99b9-3abd-4e46-b116-c740daf70c8f"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.388457 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-config-data" (OuterVolumeSpecName: "config-data") pod "a68b99b9-3abd-4e46-b116-c740daf70c8f" (UID: "a68b99b9-3abd-4e46-b116-c740daf70c8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.391109 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-scripts" (OuterVolumeSpecName: "scripts") pod "a68b99b9-3abd-4e46-b116-c740daf70c8f" (UID: "a68b99b9-3abd-4e46-b116-c740daf70c8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.406027 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.406056 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k2jb\" (UniqueName: \"kubernetes.io/projected/a68b99b9-3abd-4e46-b116-c740daf70c8f-kube-api-access-7k2jb\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.406069 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.406077 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.406086 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a68b99b9-3abd-4e46-b116-c740daf70c8f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.406097 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a68b99b9-3abd-4e46-b116-c740daf70c8f-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.541372 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-cc47695ff-btlzb"] Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.695333 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a","Type":"ContainerStarted","Data":"b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6"} Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.698614 4717 generic.go:334] "Generic (PLEG): container finished" podID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerID="285a10d6d26d2c51bd96b1ce9eb81145ab74d2c4266a5c555ccb56be2c673922" exitCode=0 Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.698643 4717 generic.go:334] "Generic (PLEG): container finished" podID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerID="d55911cc78bf6c0b790d72585f2d54d9b9d7471a9d4549f5b41bcacd858b0470" exitCode=2 Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.698652 4717 generic.go:334] "Generic (PLEG): container finished" podID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerID="5580a3d494f5319249f010871d7ffb376fccd5e15eb310977c1c5e35ad842a69" exitCode=0 Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.698718 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37","Type":"ContainerDied","Data":"285a10d6d26d2c51bd96b1ce9eb81145ab74d2c4266a5c555ccb56be2c673922"} Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.698765 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37","Type":"ContainerDied","Data":"d55911cc78bf6c0b790d72585f2d54d9b9d7471a9d4549f5b41bcacd858b0470"} Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 
05:49:04.698780 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37","Type":"ContainerDied","Data":"5580a3d494f5319249f010871d7ffb376fccd5e15eb310977c1c5e35ad842a69"} Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.700064 4717 generic.go:334] "Generic (PLEG): container finished" podID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerID="35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200" exitCode=137 Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.700121 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cf759c7cb-qxb65" event={"ID":"a68b99b9-3abd-4e46-b116-c740daf70c8f","Type":"ContainerDied","Data":"35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200"} Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.700145 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cf759c7cb-qxb65" event={"ID":"a68b99b9-3abd-4e46-b116-c740daf70c8f","Type":"ContainerDied","Data":"3b812ea7dff38aec51591b7b4b9d2de2ad9d1d7c5cb7c572f9bf8c1b89923f4f"} Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.700161 4717 scope.go:117] "RemoveContainer" containerID="a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.700259 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cf759c7cb-qxb65" Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.701193 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-cc47695ff-btlzb" event={"ID":"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6","Type":"ContainerStarted","Data":"7c8ced460d6dc917f90885899b7f4e8fe22c6b26e93daf95be829106a81a4a28"} Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.741228 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cf759c7cb-qxb65"] Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.752124 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7cf759c7cb-qxb65"] Mar 08 05:49:04 crc kubenswrapper[4717]: I0308 05:49:04.875467 4717 scope.go:117] "RemoveContainer" containerID="35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.015660 4717 scope.go:117] "RemoveContainer" containerID="a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67" Mar 08 05:49:05 crc kubenswrapper[4717]: E0308 05:49:05.016645 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67\": container with ID starting with a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67 not found: ID does not exist" containerID="a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.016678 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67"} err="failed to get container status \"a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67\": rpc error: code = NotFound desc = could not find container \"a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67\": container 
with ID starting with a7d7c5e241f44ace5d86c6f27083e6166ee5cccf87258c3acd92b49f311fff67 not found: ID does not exist" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.016713 4717 scope.go:117] "RemoveContainer" containerID="35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200" Mar 08 05:49:05 crc kubenswrapper[4717]: E0308 05:49:05.016994 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200\": container with ID starting with 35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200 not found: ID does not exist" containerID="35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.017045 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200"} err="failed to get container status \"35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200\": rpc error: code = NotFound desc = could not find container \"35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200\": container with ID starting with 35d86f1fb6244cd980a55d5caa12523d9ed01decb09199845bf234da2967d200 not found: ID does not exist" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.722670 4717 generic.go:334] "Generic (PLEG): container finished" podID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerID="c4e826da2cc6d7ca8173344c8c10d03a82278d885982fc2c702c7e03e4a551ab" exitCode=0 Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.722749 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37","Type":"ContainerDied","Data":"c4e826da2cc6d7ca8173344c8c10d03a82278d885982fc2c702c7e03e4a551ab"} Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.725947 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-cc47695ff-btlzb" event={"ID":"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6","Type":"ContainerStarted","Data":"ea3c2743718fc7ad7449b7ebda3f1c2f9166642c655ba735d16e31806b27b30e"} Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.725985 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-cc47695ff-btlzb" event={"ID":"050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6","Type":"ContainerStarted","Data":"9a043b18b6e72fd528dda1cbc03dc92e871e64df4275074ba506d8a0805b8056"} Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.726208 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.726273 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-cc47695ff-btlzb" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.752945 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-cc47695ff-btlzb" podStartSLOduration=2.7529315309999998 podStartE2EDuration="2.752931531s" podCreationTimestamp="2026-03-08 05:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:05.743819836 +0000 UTC m=+1372.661468680" watchObservedRunningTime="2026-03-08 05:49:05.752931531 +0000 UTC m=+1372.670580375" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.757643 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.800135 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" path="/var/lib/kubelet/pods/a68b99b9-3abd-4e46-b116-c740daf70c8f/volumes" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.933311 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-config-data\") pod \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.933437 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-scripts\") pod \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.933491 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-combined-ca-bundle\") pod \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.933510 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-sg-core-conf-yaml\") pod \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.933620 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-run-httpd\") pod \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\" 
(UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.933662 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kc99\" (UniqueName: \"kubernetes.io/projected/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-kube-api-access-9kc99\") pod \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.933700 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-log-httpd\") pod \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\" (UID: \"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37\") " Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.934032 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" (UID: "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.934241 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" (UID: "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.934523 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.934547 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.941961 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-kube-api-access-9kc99" (OuterVolumeSpecName: "kube-api-access-9kc99") pod "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" (UID: "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37"). InnerVolumeSpecName "kube-api-access-9kc99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.942522 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-scripts" (OuterVolumeSpecName: "scripts") pod "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" (UID: "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:05 crc kubenswrapper[4717]: I0308 05:49:05.974195 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" (UID: "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.026515 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" (UID: "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.036326 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.036358 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.036368 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.036377 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kc99\" (UniqueName: \"kubernetes.io/projected/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-kube-api-access-9kc99\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.046837 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-config-data" (OuterVolumeSpecName: "config-data") pod "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" (UID: "0fcf6afa-3577-4f0c-9a71-8133eb6c7f37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.138208 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.737490 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.737488 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fcf6afa-3577-4f0c-9a71-8133eb6c7f37","Type":"ContainerDied","Data":"db09693a7a76ac4d53b1b532eb993adb342e0de9056a048aeb782655b1274bf4"} Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.737615 4717 scope.go:117] "RemoveContainer" containerID="285a10d6d26d2c51bd96b1ce9eb81145ab74d2c4266a5c555ccb56be2c673922" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.801260 4717 scope.go:117] "RemoveContainer" containerID="d55911cc78bf6c0b790d72585f2d54d9b9d7471a9d4549f5b41bcacd858b0470" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.820773 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.837972 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.843832 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:06 crc kubenswrapper[4717]: E0308 05:49:06.844268 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon" Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844287 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon" Mar 08 05:49:06 crc 
kubenswrapper[4717]: E0308 05:49:06.844303 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="ceilometer-notification-agent"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844310 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="ceilometer-notification-agent"
Mar 08 05:49:06 crc kubenswrapper[4717]: E0308 05:49:06.844324 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="sg-core"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844329 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="sg-core"
Mar 08 05:49:06 crc kubenswrapper[4717]: E0308 05:49:06.844345 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon-log"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844350 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon-log"
Mar 08 05:49:06 crc kubenswrapper[4717]: E0308 05:49:06.844359 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="ceilometer-central-agent"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844364 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="ceilometer-central-agent"
Mar 08 05:49:06 crc kubenswrapper[4717]: E0308 05:49:06.844385 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="proxy-httpd"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844391 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="proxy-httpd"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844582 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="ceilometer-central-agent"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844597 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="ceilometer-notification-agent"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844613 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon-log"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844628 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="proxy-httpd"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844637 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" containerName="sg-core"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.844649 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68b99b9-3abd-4e46-b116-c740daf70c8f" containerName="horizon"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.846434 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.847855 4717 scope.go:117] "RemoveContainer" containerID="c4e826da2cc6d7ca8173344c8c10d03a82278d885982fc2c702c7e03e4a551ab"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.851095 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.851258 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.853142 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.854529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.854582 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-scripts\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.854616 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-run-httpd\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.854749 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwvj\" (UniqueName: \"kubernetes.io/projected/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-kube-api-access-7rwvj\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.854833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-log-httpd\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.855049 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.855070 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-config-data\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.874272 4717 scope.go:117] "RemoveContainer" containerID="5580a3d494f5319249f010871d7ffb376fccd5e15eb310977c1c5e35ad842a69"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.955778 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.955820 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-scripts\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.955844 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-run-httpd\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.956408 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-run-httpd\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.956845 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwvj\" (UniqueName: \"kubernetes.io/projected/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-kube-api-access-7rwvj\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.956888 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-log-httpd\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.956995 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.957019 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-config-data\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.957204 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-log-httpd\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.960159 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-scripts\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.960325 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.961720 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-config-data\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.971169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:06 crc kubenswrapper[4717]: I0308 05:49:06.974381 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwvj\" (UniqueName: \"kubernetes.io/projected/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-kube-api-access-7rwvj\") pod \"ceilometer-0\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " pod="openstack/ceilometer-0"
Mar 08 05:49:07 crc kubenswrapper[4717]: I0308 05:49:07.166769 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 05:49:07 crc kubenswrapper[4717]: I0308 05:49:07.624907 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 05:49:07 crc kubenswrapper[4717]: I0308 05:49:07.645816 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 05:49:07 crc kubenswrapper[4717]: I0308 05:49:07.855534 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fcf6afa-3577-4f0c-9a71-8133eb6c7f37" path="/var/lib/kubelet/pods/0fcf6afa-3577-4f0c-9a71-8133eb6c7f37/volumes"
Mar 08 05:49:08 crc kubenswrapper[4717]: I0308 05:49:08.479739 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:08 crc kubenswrapper[4717]: I0308 05:49:08.528876 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:08 crc kubenswrapper[4717]: I0308 05:49:08.756527 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:08 crc kubenswrapper[4717]: I0308 05:49:08.785997 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:11 crc kubenswrapper[4717]: I0308 05:49:11.131373 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Mar 08 05:49:11 crc kubenswrapper[4717]: I0308 05:49:11.132929 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="6be711ba-e0dc-4d84-a9d3-910819cc02e3" containerName="watcher-applier" containerID="cri-o://15e4e2f7f032554a9a7fb21fdd9fd66e43fba3ddb3d00d7f1611fe2fd4558691" gracePeriod=30
Mar 08 05:49:11 crc kubenswrapper[4717]: I0308 05:49:11.158549 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 08 05:49:11 crc kubenswrapper[4717]: I0308 05:49:11.176004 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Mar 08 05:49:11 crc kubenswrapper[4717]: I0308 05:49:11.176353 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="773c0536-3f49-45c0-ae25-88e62b1c97e4" containerName="watcher-api" containerID="cri-o://79e11794d337c00b54012f0999785dd3251bf62880c9a5bf907f504347aa5dbd" gracePeriod=30
Mar 08 05:49:11 crc kubenswrapper[4717]: I0308 05:49:11.176608 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="773c0536-3f49-45c0-ae25-88e62b1c97e4" containerName="watcher-api-log" containerID="cri-o://f13035681b8f54a7e25632c6689dfd30c854f5f2a6aa27e142ef2bb93a7ef77b" gracePeriod=30
Mar 08 05:49:11 crc kubenswrapper[4717]: I0308 05:49:11.811917 4717 generic.go:334] "Generic (PLEG): container finished" podID="773c0536-3f49-45c0-ae25-88e62b1c97e4" containerID="f13035681b8f54a7e25632c6689dfd30c854f5f2a6aa27e142ef2bb93a7ef77b" exitCode=143
Mar 08 05:49:11 crc kubenswrapper[4717]: I0308 05:49:11.812313 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine" containerID="cri-o://b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6" gracePeriod=30
Mar 08 05:49:11 crc kubenswrapper[4717]: I0308 05:49:11.812003 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"773c0536-3f49-45c0-ae25-88e62b1c97e4","Type":"ContainerDied","Data":"f13035681b8f54a7e25632c6689dfd30c854f5f2a6aa27e142ef2bb93a7ef77b"}
Mar 08 05:49:12 crc kubenswrapper[4717]: I0308 05:49:12.827023 4717 generic.go:334] "Generic (PLEG): container finished" podID="773c0536-3f49-45c0-ae25-88e62b1c97e4" containerID="79e11794d337c00b54012f0999785dd3251bf62880c9a5bf907f504347aa5dbd" exitCode=0
Mar 08 05:49:12 crc kubenswrapper[4717]: I0308 05:49:12.827066 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"773c0536-3f49-45c0-ae25-88e62b1c97e4","Type":"ContainerDied","Data":"79e11794d337c00b54012f0999785dd3251bf62880c9a5bf907f504347aa5dbd"}
Mar 08 05:49:12 crc kubenswrapper[4717]: I0308 05:49:12.913432 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 08 05:49:12 crc kubenswrapper[4717]: I0308 05:49:12.913746 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerName="glance-log" containerID="cri-o://9af55bd54a70d80bac5b48e8d5331e1d3df92c4f2ea98306d143c2816018a4ea" gracePeriod=30
Mar 08 05:49:12 crc kubenswrapper[4717]: I0308 05:49:12.914009 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerName="glance-httpd" containerID="cri-o://75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319" gracePeriod=30
Mar 08 05:49:13 crc kubenswrapper[4717]: E0308 05:49:13.584768 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15e4e2f7f032554a9a7fb21fdd9fd66e43fba3ddb3d00d7f1611fe2fd4558691" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 08 05:49:13 crc kubenswrapper[4717]: E0308 05:49:13.594647 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15e4e2f7f032554a9a7fb21fdd9fd66e43fba3ddb3d00d7f1611fe2fd4558691" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 08 05:49:13 crc kubenswrapper[4717]: E0308 05:49:13.599779 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15e4e2f7f032554a9a7fb21fdd9fd66e43fba3ddb3d00d7f1611fe2fd4558691" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 08 05:49:13 crc kubenswrapper[4717]: E0308 05:49:13.599862 4717 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="6be711ba-e0dc-4d84-a9d3-910819cc02e3" containerName="watcher-applier"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.605365 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-lwr4v"]
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.606763 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lwr4v"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.617631 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ec8299-e288-40a1-882a-0980ef3b21d4-operator-scripts\") pod \"nova-api-db-create-lwr4v\" (UID: \"63ec8299-e288-40a1-882a-0980ef3b21d4\") " pod="openstack/nova-api-db-create-lwr4v"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.617711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwh76\" (UniqueName: \"kubernetes.io/projected/63ec8299-e288-40a1-882a-0980ef3b21d4-kube-api-access-vwh76\") pod \"nova-api-db-create-lwr4v\" (UID: \"63ec8299-e288-40a1-882a-0980ef3b21d4\") " pod="openstack/nova-api-db-create-lwr4v"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.629893 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lwr4v"]
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.719370 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ec8299-e288-40a1-882a-0980ef3b21d4-operator-scripts\") pod \"nova-api-db-create-lwr4v\" (UID: \"63ec8299-e288-40a1-882a-0980ef3b21d4\") " pod="openstack/nova-api-db-create-lwr4v"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.719431 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwh76\" (UniqueName: \"kubernetes.io/projected/63ec8299-e288-40a1-882a-0980ef3b21d4-kube-api-access-vwh76\") pod \"nova-api-db-create-lwr4v\" (UID: \"63ec8299-e288-40a1-882a-0980ef3b21d4\") " pod="openstack/nova-api-db-create-lwr4v"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.720440 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ec8299-e288-40a1-882a-0980ef3b21d4-operator-scripts\") pod \"nova-api-db-create-lwr4v\" (UID: \"63ec8299-e288-40a1-882a-0980ef3b21d4\") " pod="openstack/nova-api-db-create-lwr4v"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.745230 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwh76\" (UniqueName: \"kubernetes.io/projected/63ec8299-e288-40a1-882a-0980ef3b21d4-kube-api-access-vwh76\") pod \"nova-api-db-create-lwr4v\" (UID: \"63ec8299-e288-40a1-882a-0980ef3b21d4\") " pod="openstack/nova-api-db-create-lwr4v"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.822641 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jpsxr"]
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.823799 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jpsxr"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.860237 4717 generic.go:334] "Generic (PLEG): container finished" podID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerID="9af55bd54a70d80bac5b48e8d5331e1d3df92c4f2ea98306d143c2816018a4ea" exitCode=143
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.860284 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6345fc76-e42d-4a13-90d2-c2bd5135f073","Type":"ContainerDied","Data":"9af55bd54a70d80bac5b48e8d5331e1d3df92c4f2ea98306d143c2816018a4ea"}
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.865982 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jpsxr"]
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.903204 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-30a0-account-create-update-7xzkx"]
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.911576 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30a0-account-create-update-7xzkx"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.914148 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.923298 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lwr4v"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.927967 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-30a0-account-create-update-7xzkx"]
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.956010 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-cc47695ff-btlzb"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.961354 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-cc47695ff-btlzb"
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.980473 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wwbxz"]
Mar 08 05:49:13 crc kubenswrapper[4717]: I0308 05:49:13.994485 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwbxz"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.000141 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wwbxz"]
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.032199 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgxlt\" (UniqueName: \"kubernetes.io/projected/2d25b2af-6a1d-4145-8627-5ba8338bcbef-kube-api-access-xgxlt\") pod \"nova-api-30a0-account-create-update-7xzkx\" (UID: \"2d25b2af-6a1d-4145-8627-5ba8338bcbef\") " pod="openstack/nova-api-30a0-account-create-update-7xzkx"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.032390 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d25b2af-6a1d-4145-8627-5ba8338bcbef-operator-scripts\") pod \"nova-api-30a0-account-create-update-7xzkx\" (UID: \"2d25b2af-6a1d-4145-8627-5ba8338bcbef\") " pod="openstack/nova-api-30a0-account-create-update-7xzkx"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.032542 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5702b49-8d23-42f8-a162-6783a3eb4a53-operator-scripts\") pod \"nova-cell0-db-create-jpsxr\" (UID: \"e5702b49-8d23-42f8-a162-6783a3eb4a53\") " pod="openstack/nova-cell0-db-create-jpsxr"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.033659 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdxw4\" (UniqueName: \"kubernetes.io/projected/e5702b49-8d23-42f8-a162-6783a3eb4a53-kube-api-access-pdxw4\") pod \"nova-cell0-db-create-jpsxr\" (UID: \"e5702b49-8d23-42f8-a162-6783a3eb4a53\") " pod="openstack/nova-cell0-db-create-jpsxr"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.058820 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.175:9292/healthcheck\": dial tcp 10.217.0.175:9292: connect: connection refused"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.059099 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.175:9292/healthcheck\": dial tcp 10.217.0.175:9292: connect: connection refused"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.078675 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ef6d-account-create-update-zrg6l"]
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.080189 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.085004 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.114968 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ef6d-account-create-update-zrg6l"]
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.134902 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgxlt\" (UniqueName: \"kubernetes.io/projected/2d25b2af-6a1d-4145-8627-5ba8338bcbef-kube-api-access-xgxlt\") pod \"nova-api-30a0-account-create-update-7xzkx\" (UID: \"2d25b2af-6a1d-4145-8627-5ba8338bcbef\") " pod="openstack/nova-api-30a0-account-create-update-7xzkx"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.134969 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d25b2af-6a1d-4145-8627-5ba8338bcbef-operator-scripts\") pod \"nova-api-30a0-account-create-update-7xzkx\" (UID: \"2d25b2af-6a1d-4145-8627-5ba8338bcbef\") " pod="openstack/nova-api-30a0-account-create-update-7xzkx"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.134991 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5702b49-8d23-42f8-a162-6783a3eb4a53-operator-scripts\") pod \"nova-cell0-db-create-jpsxr\" (UID: \"e5702b49-8d23-42f8-a162-6783a3eb4a53\") " pod="openstack/nova-cell0-db-create-jpsxr"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.135017 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75976964-499c-4d15-937e-4921f1b16150-operator-scripts\") pod \"nova-cell1-db-create-wwbxz\" (UID: \"75976964-499c-4d15-937e-4921f1b16150\") " pod="openstack/nova-cell1-db-create-wwbxz"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.135055 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l75g\" (UniqueName: \"kubernetes.io/projected/75976964-499c-4d15-937e-4921f1b16150-kube-api-access-7l75g\") pod \"nova-cell1-db-create-wwbxz\" (UID: \"75976964-499c-4d15-937e-4921f1b16150\") " pod="openstack/nova-cell1-db-create-wwbxz"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.135091 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdxw4\" (UniqueName: \"kubernetes.io/projected/e5702b49-8d23-42f8-a162-6783a3eb4a53-kube-api-access-pdxw4\") pod \"nova-cell0-db-create-jpsxr\" (UID: \"e5702b49-8d23-42f8-a162-6783a3eb4a53\") " pod="openstack/nova-cell0-db-create-jpsxr"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.136652 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5702b49-8d23-42f8-a162-6783a3eb4a53-operator-scripts\") pod \"nova-cell0-db-create-jpsxr\" (UID: \"e5702b49-8d23-42f8-a162-6783a3eb4a53\") " pod="openstack/nova-cell0-db-create-jpsxr"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.136855 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d25b2af-6a1d-4145-8627-5ba8338bcbef-operator-scripts\") pod \"nova-api-30a0-account-create-update-7xzkx\" (UID: \"2d25b2af-6a1d-4145-8627-5ba8338bcbef\") " pod="openstack/nova-api-30a0-account-create-update-7xzkx"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.158579 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgxlt\" (UniqueName: \"kubernetes.io/projected/2d25b2af-6a1d-4145-8627-5ba8338bcbef-kube-api-access-xgxlt\") pod \"nova-api-30a0-account-create-update-7xzkx\" (UID: \"2d25b2af-6a1d-4145-8627-5ba8338bcbef\") " pod="openstack/nova-api-30a0-account-create-update-7xzkx"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.159120 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdxw4\" (UniqueName: \"kubernetes.io/projected/e5702b49-8d23-42f8-a162-6783a3eb4a53-kube-api-access-pdxw4\") pod \"nova-cell0-db-create-jpsxr\" (UID: \"e5702b49-8d23-42f8-a162-6783a3eb4a53\") " pod="openstack/nova-cell0-db-create-jpsxr"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.202793 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jpsxr"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.237805 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75976964-499c-4d15-937e-4921f1b16150-operator-scripts\") pod \"nova-cell1-db-create-wwbxz\" (UID: \"75976964-499c-4d15-937e-4921f1b16150\") " pod="openstack/nova-cell1-db-create-wwbxz"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.237869 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l75g\" (UniqueName: \"kubernetes.io/projected/75976964-499c-4d15-937e-4921f1b16150-kube-api-access-7l75g\") pod \"nova-cell1-db-create-wwbxz\" (UID: \"75976964-499c-4d15-937e-4921f1b16150\") " pod="openstack/nova-cell1-db-create-wwbxz"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.237980 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ffbca0-f2a1-4c6b-8594-996db23783f2-operator-scripts\") pod \"nova-cell0-ef6d-account-create-update-zrg6l\" (UID: \"46ffbca0-f2a1-4c6b-8594-996db23783f2\") " pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.238452 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44hlt\" (UniqueName: \"kubernetes.io/projected/46ffbca0-f2a1-4c6b-8594-996db23783f2-kube-api-access-44hlt\") pod \"nova-cell0-ef6d-account-create-update-zrg6l\" (UID: \"46ffbca0-f2a1-4c6b-8594-996db23783f2\") " pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.240064 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75976964-499c-4d15-937e-4921f1b16150-operator-scripts\") pod \"nova-cell1-db-create-wwbxz\" (UID: \"75976964-499c-4d15-937e-4921f1b16150\") " pod="openstack/nova-cell1-db-create-wwbxz"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.268945 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30a0-account-create-update-7xzkx"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.271279 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l75g\" (UniqueName: \"kubernetes.io/projected/75976964-499c-4d15-937e-4921f1b16150-kube-api-access-7l75g\") pod \"nova-cell1-db-create-wwbxz\" (UID: \"75976964-499c-4d15-937e-4921f1b16150\") " pod="openstack/nova-cell1-db-create-wwbxz"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.341100 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d06f-account-create-update-8vf24"]
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.342540 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d06f-account-create-update-8vf24"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.343137 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44hlt\" (UniqueName: \"kubernetes.io/projected/46ffbca0-f2a1-4c6b-8594-996db23783f2-kube-api-access-44hlt\") pod \"nova-cell0-ef6d-account-create-update-zrg6l\" (UID: \"46ffbca0-f2a1-4c6b-8594-996db23783f2\") " pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.343326 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ffbca0-f2a1-4c6b-8594-996db23783f2-operator-scripts\") pod \"nova-cell0-ef6d-account-create-update-zrg6l\" (UID: \"46ffbca0-f2a1-4c6b-8594-996db23783f2\") " pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.343962 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ffbca0-f2a1-4c6b-8594-996db23783f2-operator-scripts\") pod \"nova-cell0-ef6d-account-create-update-zrg6l\" (UID: \"46ffbca0-f2a1-4c6b-8594-996db23783f2\") " pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.348624 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.354088 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d06f-account-create-update-8vf24"]
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.365050 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44hlt\" (UniqueName: \"kubernetes.io/projected/46ffbca0-f2a1-4c6b-8594-996db23783f2-kube-api-access-44hlt\") pod \"nova-cell0-ef6d-account-create-update-zrg6l\" (UID: \"46ffbca0-f2a1-4c6b-8594-996db23783f2\") " pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.443780 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwbxz"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.445793 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413d3437-74ee-4793-9088-77fac53e4d7c-operator-scripts\") pod \"nova-cell1-d06f-account-create-update-8vf24\" (UID: \"413d3437-74ee-4793-9088-77fac53e4d7c\") " pod="openstack/nova-cell1-d06f-account-create-update-8vf24"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.445940 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crhfg\" (UniqueName: \"kubernetes.io/projected/413d3437-74ee-4793-9088-77fac53e4d7c-kube-api-access-crhfg\") pod \"nova-cell1-d06f-account-create-update-8vf24\" (UID: \"413d3437-74ee-4793-9088-77fac53e4d7c\") " pod="openstack/nova-cell1-d06f-account-create-update-8vf24"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.461191 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.475406 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.553361 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-internal-tls-certs\") pod \"773c0536-3f49-45c0-ae25-88e62b1c97e4\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.553595 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crhfg\" (UniqueName: \"kubernetes.io/projected/413d3437-74ee-4793-9088-77fac53e4d7c-kube-api-access-crhfg\") pod \"nova-cell1-d06f-account-create-update-8vf24\" (UID: \"413d3437-74ee-4793-9088-77fac53e4d7c\") " pod="openstack/nova-cell1-d06f-account-create-update-8vf24" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.553846 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413d3437-74ee-4793-9088-77fac53e4d7c-operator-scripts\") pod \"nova-cell1-d06f-account-create-update-8vf24\" (UID: \"413d3437-74ee-4793-9088-77fac53e4d7c\") " pod="openstack/nova-cell1-d06f-account-create-update-8vf24" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.554707 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413d3437-74ee-4793-9088-77fac53e4d7c-operator-scripts\") pod \"nova-cell1-d06f-account-create-update-8vf24\" (UID: \"413d3437-74ee-4793-9088-77fac53e4d7c\") " pod="openstack/nova-cell1-d06f-account-create-update-8vf24" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.572363 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crhfg\" (UniqueName: \"kubernetes.io/projected/413d3437-74ee-4793-9088-77fac53e4d7c-kube-api-access-crhfg\") pod 
\"nova-cell1-d06f-account-create-update-8vf24\" (UID: \"413d3437-74ee-4793-9088-77fac53e4d7c\") " pod="openstack/nova-cell1-d06f-account-create-update-8vf24" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.654629 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773c0536-3f49-45c0-ae25-88e62b1c97e4-logs\") pod \"773c0536-3f49-45c0-ae25-88e62b1c97e4\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.654758 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kzdt\" (UniqueName: \"kubernetes.io/projected/773c0536-3f49-45c0-ae25-88e62b1c97e4-kube-api-access-6kzdt\") pod \"773c0536-3f49-45c0-ae25-88e62b1c97e4\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.654868 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-public-tls-certs\") pod \"773c0536-3f49-45c0-ae25-88e62b1c97e4\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.654902 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-custom-prometheus-ca\") pod \"773c0536-3f49-45c0-ae25-88e62b1c97e4\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.654961 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-config-data\") pod \"773c0536-3f49-45c0-ae25-88e62b1c97e4\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.655009 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-combined-ca-bundle\") pod \"773c0536-3f49-45c0-ae25-88e62b1c97e4\" (UID: \"773c0536-3f49-45c0-ae25-88e62b1c97e4\") " Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.663660 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/773c0536-3f49-45c0-ae25-88e62b1c97e4-logs" (OuterVolumeSpecName: "logs") pod "773c0536-3f49-45c0-ae25-88e62b1c97e4" (UID: "773c0536-3f49-45c0-ae25-88e62b1c97e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.666129 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d06f-account-create-update-8vf24" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.679361 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/773c0536-3f49-45c0-ae25-88e62b1c97e4-kube-api-access-6kzdt" (OuterVolumeSpecName: "kube-api-access-6kzdt") pod "773c0536-3f49-45c0-ae25-88e62b1c97e4" (UID: "773c0536-3f49-45c0-ae25-88e62b1c97e4"). InnerVolumeSpecName "kube-api-access-6kzdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.758845 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773c0536-3f49-45c0-ae25-88e62b1c97e4-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.758871 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kzdt\" (UniqueName: \"kubernetes.io/projected/773c0536-3f49-45c0-ae25-88e62b1c97e4-kube-api-access-6kzdt\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.768950 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "773c0536-3f49-45c0-ae25-88e62b1c97e4" (UID: "773c0536-3f49-45c0-ae25-88e62b1c97e4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.777135 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "773c0536-3f49-45c0-ae25-88e62b1c97e4" (UID: "773c0536-3f49-45c0-ae25-88e62b1c97e4"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.795774 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "773c0536-3f49-45c0-ae25-88e62b1c97e4" (UID: "773c0536-3f49-45c0-ae25-88e62b1c97e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.795864 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-config-data" (OuterVolumeSpecName: "config-data") pod "773c0536-3f49-45c0-ae25-88e62b1c97e4" (UID: "773c0536-3f49-45c0-ae25-88e62b1c97e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.799090 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "773c0536-3f49-45c0-ae25-88e62b1c97e4" (UID: "773c0536-3f49-45c0-ae25-88e62b1c97e4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.864581 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.865050 4717 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.865065 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.865078 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" 
Mar 08 05:49:14 crc kubenswrapper[4717]: I0308 05:49:14.865103 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773c0536-3f49-45c0-ae25-88e62b1c97e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.044099 4717 generic.go:334] "Generic (PLEG): container finished" podID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerID="75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319" exitCode=0 Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.044173 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6345fc76-e42d-4a13-90d2-c2bd5135f073","Type":"ContainerDied","Data":"75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319"} Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.052006 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.066011 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lwr4v"] Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.070126 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"773c0536-3f49-45c0-ae25-88e62b1c97e4","Type":"ContainerDied","Data":"0033d2c1dfe31597618634c81450476366fe350da3becf8f7ff928b0f9390d41"} Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.070171 4717 scope.go:117] "RemoveContainer" containerID="79e11794d337c00b54012f0999785dd3251bf62880c9a5bf907f504347aa5dbd" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.070302 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: W0308 05:49:15.098995 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63ec8299_e288_40a1_882a_0980ef3b21d4.slice/crio-3db5bbb02e6fc932643ecaf292c3f713aa79d4866763d07f6c38ad27f5764d4d WatchSource:0}: Error finding container 3db5bbb02e6fc932643ecaf292c3f713aa79d4866763d07f6c38ad27f5764d4d: Status 404 returned error can't find the container with id 3db5bbb02e6fc932643ecaf292c3f713aa79d4866763d07f6c38ad27f5764d4d Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.099654 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f","Type":"ContainerStarted","Data":"077aa7c9feb4b99c76efa4485f286cda23e577f6515f0f7d1afea5bbcddb0e5b"} Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.133548 4717 generic.go:334] "Generic (PLEG): container finished" podID="6be711ba-e0dc-4d84-a9d3-910819cc02e3" containerID="15e4e2f7f032554a9a7fb21fdd9fd66e43fba3ddb3d00d7f1611fe2fd4558691" exitCode=0 Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.135924 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"6be711ba-e0dc-4d84-a9d3-910819cc02e3","Type":"ContainerDied","Data":"15e4e2f7f032554a9a7fb21fdd9fd66e43fba3ddb3d00d7f1611fe2fd4558691"} Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.186239 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.187804 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-httpd-run\") pod \"6345fc76-e42d-4a13-90d2-c2bd5135f073\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 
05:49:15.187854 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-combined-ca-bundle\") pod \"6345fc76-e42d-4a13-90d2-c2bd5135f073\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.187918 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-scripts\") pod \"6345fc76-e42d-4a13-90d2-c2bd5135f073\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.187964 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-public-tls-certs\") pod \"6345fc76-e42d-4a13-90d2-c2bd5135f073\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.187984 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-logs\") pod \"6345fc76-e42d-4a13-90d2-c2bd5135f073\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.188014 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjc26\" (UniqueName: \"kubernetes.io/projected/6345fc76-e42d-4a13-90d2-c2bd5135f073-kube-api-access-fjc26\") pod \"6345fc76-e42d-4a13-90d2-c2bd5135f073\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.188101 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"6345fc76-e42d-4a13-90d2-c2bd5135f073\" (UID: 
\"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.188134 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-config-data\") pod \"6345fc76-e42d-4a13-90d2-c2bd5135f073\" (UID: \"6345fc76-e42d-4a13-90d2-c2bd5135f073\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.212370 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-logs" (OuterVolumeSpecName: "logs") pod "6345fc76-e42d-4a13-90d2-c2bd5135f073" (UID: "6345fc76-e42d-4a13-90d2-c2bd5135f073"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.215813 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6345fc76-e42d-4a13-90d2-c2bd5135f073" (UID: "6345fc76-e42d-4a13-90d2-c2bd5135f073"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.234761 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.235967 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "6345fc76-e42d-4a13-90d2-c2bd5135f073" (UID: "6345fc76-e42d-4a13-90d2-c2bd5135f073"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.236080 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-scripts" (OuterVolumeSpecName: "scripts") pod "6345fc76-e42d-4a13-90d2-c2bd5135f073" (UID: "6345fc76-e42d-4a13-90d2-c2bd5135f073"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.254086 4717 scope.go:117] "RemoveContainer" containerID="f13035681b8f54a7e25632c6689dfd30c854f5f2a6aa27e142ef2bb93a7ef77b" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.260924 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6345fc76-e42d-4a13-90d2-c2bd5135f073-kube-api-access-fjc26" (OuterVolumeSpecName: "kube-api-access-fjc26") pod "6345fc76-e42d-4a13-90d2-c2bd5135f073" (UID: "6345fc76-e42d-4a13-90d2-c2bd5135f073"). InnerVolumeSpecName "kube-api-access-fjc26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.270004 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:49:15 crc kubenswrapper[4717]: E0308 05:49:15.270824 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773c0536-3f49-45c0-ae25-88e62b1c97e4" containerName="watcher-api" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.270927 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="773c0536-3f49-45c0-ae25-88e62b1c97e4" containerName="watcher-api" Mar 08 05:49:15 crc kubenswrapper[4717]: E0308 05:49:15.271043 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerName="glance-log" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.271144 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerName="glance-log" Mar 08 05:49:15 crc kubenswrapper[4717]: E0308 05:49:15.271271 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerName="glance-httpd" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.271371 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerName="glance-httpd" Mar 08 05:49:15 crc kubenswrapper[4717]: E0308 05:49:15.271495 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773c0536-3f49-45c0-ae25-88e62b1c97e4" containerName="watcher-api-log" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.271598 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="773c0536-3f49-45c0-ae25-88e62b1c97e4" containerName="watcher-api-log" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.272058 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerName="glance-log" Mar 08 05:49:15 crc 
kubenswrapper[4717]: I0308 05:49:15.272150 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="773c0536-3f49-45c0-ae25-88e62b1c97e4" containerName="watcher-api-log" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.272227 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="773c0536-3f49-45c0-ae25-88e62b1c97e4" containerName="watcher-api" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.272324 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" containerName="glance-httpd" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.273608 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.276657 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.276848 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.276979 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.290499 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjc26\" (UniqueName: \"kubernetes.io/projected/6345fc76-e42d-4a13-90d2-c2bd5135f073-kube-api-access-fjc26\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.290534 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.290544 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.290552 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.290561 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6345fc76-e42d-4a13-90d2-c2bd5135f073-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.321989 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.393500 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-config-data\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.393755 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stkg7\" (UniqueName: \"kubernetes.io/projected/acde9c29-0910-40cc-9da8-06a566c67b4c-kube-api-access-stkg7\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.393792 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acde9c29-0910-40cc-9da8-06a566c67b4c-logs\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.393836 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-public-tls-certs\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.393865 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.393890 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.393909 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.403153 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.425486 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.436105 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6345fc76-e42d-4a13-90d2-c2bd5135f073" (UID: "6345fc76-e42d-4a13-90d2-c2bd5135f073"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.464859 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6345fc76-e42d-4a13-90d2-c2bd5135f073" (UID: "6345fc76-e42d-4a13-90d2-c2bd5135f073"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.484673 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-config-data" (OuterVolumeSpecName: "config-data") pod "6345fc76-e42d-4a13-90d2-c2bd5135f073" (UID: "6345fc76-e42d-4a13-90d2-c2bd5135f073"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498234 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498287 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498308 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498399 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-config-data\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498419 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stkg7\" (UniqueName: \"kubernetes.io/projected/acde9c29-0910-40cc-9da8-06a566c67b4c-kube-api-access-stkg7\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498447 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acde9c29-0910-40cc-9da8-06a566c67b4c-logs\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498487 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-public-tls-certs\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498542 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498553 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498562 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.498571 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6345fc76-e42d-4a13-90d2-c2bd5135f073-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.502540 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-public-tls-certs\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc 
kubenswrapper[4717]: I0308 05:49:15.504074 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acde9c29-0910-40cc-9da8-06a566c67b4c-logs\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.506590 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.508215 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-config-data\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.508397 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jpsxr"] Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.508788 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.509801 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/acde9c29-0910-40cc-9da8-06a566c67b4c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.514473 4717 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell1-db-create-wwbxz"] Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.521961 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stkg7\" (UniqueName: \"kubernetes.io/projected/acde9c29-0910-40cc-9da8-06a566c67b4c-kube-api-access-stkg7\") pod \"watcher-api-0\" (UID: \"acde9c29-0910-40cc-9da8-06a566c67b4c\") " pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.522917 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-30a0-account-create-update-7xzkx"] Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.599897 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-combined-ca-bundle\") pod \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.600237 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md6lk\" (UniqueName: \"kubernetes.io/projected/6be711ba-e0dc-4d84-a9d3-910819cc02e3-kube-api-access-md6lk\") pod \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.600318 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-config-data\") pod \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\" (UID: \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.600339 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6be711ba-e0dc-4d84-a9d3-910819cc02e3-logs\") pod \"6be711ba-e0dc-4d84-a9d3-910819cc02e3\" (UID: 
\"6be711ba-e0dc-4d84-a9d3-910819cc02e3\") " Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.601032 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6be711ba-e0dc-4d84-a9d3-910819cc02e3-logs" (OuterVolumeSpecName: "logs") pod "6be711ba-e0dc-4d84-a9d3-910819cc02e3" (UID: "6be711ba-e0dc-4d84-a9d3-910819cc02e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.604542 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be711ba-e0dc-4d84-a9d3-910819cc02e3-kube-api-access-md6lk" (OuterVolumeSpecName: "kube-api-access-md6lk") pod "6be711ba-e0dc-4d84-a9d3-910819cc02e3" (UID: "6be711ba-e0dc-4d84-a9d3-910819cc02e3"). InnerVolumeSpecName "kube-api-access-md6lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.638954 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.660494 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6be711ba-e0dc-4d84-a9d3-910819cc02e3" (UID: "6be711ba-e0dc-4d84-a9d3-910819cc02e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.703020 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md6lk\" (UniqueName: \"kubernetes.io/projected/6be711ba-e0dc-4d84-a9d3-910819cc02e3-kube-api-access-md6lk\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.703047 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6be711ba-e0dc-4d84-a9d3-910819cc02e3-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.703064 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.725835 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-665f758875-jsp86" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.737003 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d06f-account-create-update-8vf24"] Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.754580 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-config-data" (OuterVolumeSpecName: "config-data") pod "6be711ba-e0dc-4d84-a9d3-910819cc02e3" (UID: "6be711ba-e0dc-4d84-a9d3-910819cc02e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:15 crc kubenswrapper[4717]: W0308 05:49:15.761228 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod413d3437_74ee_4793_9088_77fac53e4d7c.slice/crio-6bf45a7ea2beac4d4b3cf71f3de1bf582bbca2f105d2937cd8c1bfdd9777c745 WatchSource:0}: Error finding container 6bf45a7ea2beac4d4b3cf71f3de1bf582bbca2f105d2937cd8c1bfdd9777c745: Status 404 returned error can't find the container with id 6bf45a7ea2beac4d4b3cf71f3de1bf582bbca2f105d2937cd8c1bfdd9777c745 Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.809223 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6be711ba-e0dc-4d84-a9d3-910819cc02e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.834668 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="773c0536-3f49-45c0-ae25-88e62b1c97e4" path="/var/lib/kubelet/pods/773c0536-3f49-45c0-ae25-88e62b1c97e4/volumes" Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.835551 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6fd6c6f4-sgbls"] Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.835751 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6fd6c6f4-sgbls" podUID="f9407de8-78ab-4bf1-9f53-49e71656898a" containerName="neutron-api" containerID="cri-o://b6b164a88de189fae0dae7303712cb2904ac004f920d737bdf1c3d765a634eca" gracePeriod=30 Mar 08 05:49:15 crc kubenswrapper[4717]: I0308 05:49:15.838127 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6fd6c6f4-sgbls" podUID="f9407de8-78ab-4bf1-9f53-49e71656898a" containerName="neutron-httpd" containerID="cri-o://3e62a802a83a0fd7cc70d85b5b7b1816de1c84f06b89d893888a3028192bccb3" gracePeriod=30 Mar 08 05:49:15 crc 
kubenswrapper[4717]: I0308 05:49:15.857567 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ef6d-account-create-update-zrg6l"] Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.160174 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l" event={"ID":"46ffbca0-f2a1-4c6b-8594-996db23783f2","Type":"ContainerStarted","Data":"8e9cec116610b1b4e3fa6e7f9900c55869ecc326580a6b29a48cfae0ebda41f9"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.179155 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30a0-account-create-update-7xzkx" event={"ID":"2d25b2af-6a1d-4145-8627-5ba8338bcbef","Type":"ContainerStarted","Data":"18228d80281d2a60ffcf4c56d401c6cb7b582c8dabab1a6352f8c7e35f831ade"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.179198 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30a0-account-create-update-7xzkx" event={"ID":"2d25b2af-6a1d-4145-8627-5ba8338bcbef","Type":"ContainerStarted","Data":"8b10fdc4fded854bf6bc11e3bb3a2e70b661154b9e0668806e0ee4d46c966904"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.195804 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6345fc76-e42d-4a13-90d2-c2bd5135f073","Type":"ContainerDied","Data":"ff8a472ff8d216e15089e5c35fb29cfbc3f9d5958266b97e84c498518a10c9cd"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.195850 4717 scope.go:117] "RemoveContainer" containerID="75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.195971 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.203471 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f","Type":"ContainerStarted","Data":"8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.208268 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.213203 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-30a0-account-create-update-7xzkx" podStartSLOduration=3.213184842 podStartE2EDuration="3.213184842s" podCreationTimestamp="2026-03-08 05:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:16.194166132 +0000 UTC m=+1383.111814976" watchObservedRunningTime="2026-03-08 05:49:16.213184842 +0000 UTC m=+1383.130833686" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.216457 4717 generic.go:334] "Generic (PLEG): container finished" podID="f9407de8-78ab-4bf1-9f53-49e71656898a" containerID="3e62a802a83a0fd7cc70d85b5b7b1816de1c84f06b89d893888a3028192bccb3" exitCode=0 Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.216670 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd6c6f4-sgbls" event={"ID":"f9407de8-78ab-4bf1-9f53-49e71656898a","Type":"ContainerDied","Data":"3e62a802a83a0fd7cc70d85b5b7b1816de1c84f06b89d893888a3028192bccb3"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.221549 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jpsxr" event={"ID":"e5702b49-8d23-42f8-a162-6783a3eb4a53","Type":"ContainerStarted","Data":"f721adaa689443e873da9f195a8ac787b708bb62e59ebc9429a0b488f16dec95"} Mar 08 05:49:16 crc 
kubenswrapper[4717]: I0308 05:49:16.221591 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jpsxr" event={"ID":"e5702b49-8d23-42f8-a162-6783a3eb4a53","Type":"ContainerStarted","Data":"c444d6d4472469a4f92c47d131dbefc0a8e7ca0b25e6346ef7ae2e56784e9112"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.259744 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.264534 4717 scope.go:117] "RemoveContainer" containerID="9af55bd54a70d80bac5b48e8d5331e1d3df92c4f2ea98306d143c2816018a4ea" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.264777 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d06f-account-create-update-8vf24" event={"ID":"413d3437-74ee-4793-9088-77fac53e4d7c","Type":"ContainerStarted","Data":"6bf45a7ea2beac4d4b3cf71f3de1bf582bbca2f105d2937cd8c1bfdd9777c745"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.279313 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"6be711ba-e0dc-4d84-a9d3-910819cc02e3","Type":"ContainerDied","Data":"52cee56da04edbfd0799ffef5e0f21c28e88a1592e4cd394ea7ffbccc5ea89a2"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.279410 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Mar 08 05:49:16 crc kubenswrapper[4717]: W0308 05:49:16.280000 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacde9c29_0910_40cc_9da8_06a566c67b4c.slice/crio-e0c4c7e3a25df5a81da9562316c7c3fec14cd31c124d917f3e442d0b0a986047 WatchSource:0}: Error finding container e0c4c7e3a25df5a81da9562316c7c3fec14cd31c124d917f3e442d0b0a986047: Status 404 returned error can't find the container with id e0c4c7e3a25df5a81da9562316c7c3fec14cd31c124d917f3e442d0b0a986047 Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.283522 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.292942 4717 generic.go:334] "Generic (PLEG): container finished" podID="63ec8299-e288-40a1-882a-0980ef3b21d4" containerID="a1b2881a85ed3fb26244d840424001ed464395dbc12b2d3b1484795815b8486a" exitCode=0 Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.293001 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lwr4v" event={"ID":"63ec8299-e288-40a1-882a-0980ef3b21d4","Type":"ContainerDied","Data":"a1b2881a85ed3fb26244d840424001ed464395dbc12b2d3b1484795815b8486a"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.293023 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lwr4v" event={"ID":"63ec8299-e288-40a1-882a-0980ef3b21d4","Type":"ContainerStarted","Data":"3db5bbb02e6fc932643ecaf292c3f713aa79d4866763d07f6c38ad27f5764d4d"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.293635 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-jpsxr" podStartSLOduration=3.293616502 podStartE2EDuration="3.293616502s" podCreationTimestamp="2026-03-08 05:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:16.249445759 +0000 UTC m=+1383.167094603" watchObservedRunningTime="2026-03-08 05:49:16.293616502 +0000 UTC m=+1383.211265346" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.304325 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9a66e3e0-63d9-4ca4-ab60-8a842f37cc68","Type":"ContainerStarted","Data":"613ea70b4eeae5338fcc0ab540e42d072c88674a769674dd07f9d330b4477ae5"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.318128 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:49:16 crc kubenswrapper[4717]: E0308 05:49:16.318534 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be711ba-e0dc-4d84-a9d3-910819cc02e3" containerName="watcher-applier" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.318551 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be711ba-e0dc-4d84-a9d3-910819cc02e3" containerName="watcher-applier" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.318770 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be711ba-e0dc-4d84-a9d3-910819cc02e3" containerName="watcher-applier" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.320445 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwbxz" event={"ID":"75976964-499c-4d15-937e-4921f1b16150","Type":"ContainerStarted","Data":"8a0cb418f30515d9cca8f3fc5b3c8e605b366bb1382b7d5002e81b803047f17e"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.320473 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwbxz" event={"ID":"75976964-499c-4d15-937e-4921f1b16150","Type":"ContainerStarted","Data":"12ea192639a4903fc0696b6cfb8681539d6c096326c9cd5181714dd66fbc0a67"} Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.320571 4717 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.324397 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.324737 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.351908 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.369213 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.150743102 podStartE2EDuration="16.36830658s" podCreationTimestamp="2026-03-08 05:49:00 +0000 UTC" firstStartedPulling="2026-03-08 05:49:01.120992076 +0000 UTC m=+1368.038640920" lastFinishedPulling="2026-03-08 05:49:14.338555554 +0000 UTC m=+1381.256204398" observedRunningTime="2026-03-08 05:49:16.325180513 +0000 UTC m=+1383.242829357" watchObservedRunningTime="2026-03-08 05:49:16.36830658 +0000 UTC m=+1383.285955424" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.407520 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-wwbxz" podStartSLOduration=3.40750114 podStartE2EDuration="3.40750114s" podCreationTimestamp="2026-03-08 05:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:16.365642114 +0000 UTC m=+1383.283290958" watchObservedRunningTime="2026-03-08 05:49:16.40750114 +0000 UTC m=+1383.325149984" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.436081 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.436440 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.436519 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ffc0380-502c-48b0-b36a-8421c5503fde-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.436598 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.436702 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.436821 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ttswk\" (UniqueName: \"kubernetes.io/projected/2ffc0380-502c-48b0-b36a-8421c5503fde-kube-api-access-ttswk\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.436971 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ffc0380-502c-48b0-b36a-8421c5503fde-logs\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.437094 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.538625 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.538966 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.539092 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2ffc0380-502c-48b0-b36a-8421c5503fde-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.539196 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.539275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.539374 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttswk\" (UniqueName: \"kubernetes.io/projected/2ffc0380-502c-48b0-b36a-8421c5503fde-kube-api-access-ttswk\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.539465 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ffc0380-502c-48b0-b36a-8421c5503fde-logs\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.539550 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.541127 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.541358 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ffc0380-502c-48b0-b36a-8421c5503fde-logs\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.541776 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ffc0380-502c-48b0-b36a-8421c5503fde-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.546865 4717 scope.go:117] "RemoveContainer" containerID="15e4e2f7f032554a9a7fb21fdd9fd66e43fba3ddb3d00d7f1611fe2fd4558691" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.547297 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0" Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.547793 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.547879 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.560925 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ffc0380-502c-48b0-b36a-8421c5503fde-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.561744 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttswk\" (UniqueName: \"kubernetes.io/projected/2ffc0380-502c-48b0-b36a-8421c5503fde-kube-api-access-ttswk\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.592905 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"2ffc0380-502c-48b0-b36a-8421c5503fde\") " pod="openstack/glance-default-external-api-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.627368 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.634038 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.656665 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"]
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.676830 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.678206 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.686306 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.689293 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.746929 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58162927-f626-43d8-a792-507cf584db78-logs\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.747066 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58162927-f626-43d8-a792-507cf584db78-config-data\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.747135 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj7lx\" (UniqueName: \"kubernetes.io/projected/58162927-f626-43d8-a792-507cf584db78-kube-api-access-fj7lx\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.747189 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58162927-f626-43d8-a792-507cf584db78-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.849585 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58162927-f626-43d8-a792-507cf584db78-config-data\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.849946 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj7lx\" (UniqueName: \"kubernetes.io/projected/58162927-f626-43d8-a792-507cf584db78-kube-api-access-fj7lx\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.854897 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58162927-f626-43d8-a792-507cf584db78-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.855065 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58162927-f626-43d8-a792-507cf584db78-logs\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.855603 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58162927-f626-43d8-a792-507cf584db78-logs\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.864101 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58162927-f626-43d8-a792-507cf584db78-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.866748 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj7lx\" (UniqueName: \"kubernetes.io/projected/58162927-f626-43d8-a792-507cf584db78-kube-api-access-fj7lx\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:16 crc kubenswrapper[4717]: I0308 05:49:16.880662 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58162927-f626-43d8-a792-507cf584db78-config-data\") pod \"watcher-applier-0\" (UID: \"58162927-f626-43d8-a792-507cf584db78\") " pod="openstack/watcher-applier-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.027753 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.113052 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.169359 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbv64\" (UniqueName: \"kubernetes.io/projected/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-kube-api-access-xbv64\") pod \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") "
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.169418 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-config-data\") pod \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") "
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.169502 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-logs\") pod \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") "
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.169574 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-custom-prometheus-ca\") pod \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") "
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.169604 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-combined-ca-bundle\") pod \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\" (UID: \"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a\") "
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.177706 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-kube-api-access-xbv64" (OuterVolumeSpecName: "kube-api-access-xbv64") pod "f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" (UID: "f29ab6b7-97ec-4d9d-ba67-d4abad06de9a"). InnerVolumeSpecName "kube-api-access-xbv64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.178121 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-logs" (OuterVolumeSpecName: "logs") pod "f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" (UID: "f29ab6b7-97ec-4d9d-ba67-d4abad06de9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.218890 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" (UID: "f29ab6b7-97ec-4d9d-ba67-d4abad06de9a"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.229167 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" (UID: "f29ab6b7-97ec-4d9d-ba67-d4abad06de9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.270371 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-config-data" (OuterVolumeSpecName: "config-data") pod "f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" (UID: "f29ab6b7-97ec-4d9d-ba67-d4abad06de9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.271795 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbv64\" (UniqueName: \"kubernetes.io/projected/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-kube-api-access-xbv64\") on node \"crc\" DevicePath \"\""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.271827 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.271838 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-logs\") on node \"crc\" DevicePath \"\""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.271846 4717 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.271854 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.342243 4717 generic.go:334] "Generic (PLEG): container finished" podID="75976964-499c-4d15-937e-4921f1b16150" containerID="8a0cb418f30515d9cca8f3fc5b3c8e605b366bb1382b7d5002e81b803047f17e" exitCode=0
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.342298 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwbxz" event={"ID":"75976964-499c-4d15-937e-4921f1b16150","Type":"ContainerDied","Data":"8a0cb418f30515d9cca8f3fc5b3c8e605b366bb1382b7d5002e81b803047f17e"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.345999 4717 generic.go:334] "Generic (PLEG): container finished" podID="e5702b49-8d23-42f8-a162-6783a3eb4a53" containerID="f721adaa689443e873da9f195a8ac787b708bb62e59ebc9429a0b488f16dec95" exitCode=0
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.346047 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jpsxr" event={"ID":"e5702b49-8d23-42f8-a162-6783a3eb4a53","Type":"ContainerDied","Data":"f721adaa689443e873da9f195a8ac787b708bb62e59ebc9429a0b488f16dec95"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.349367 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"acde9c29-0910-40cc-9da8-06a566c67b4c","Type":"ContainerStarted","Data":"e6f2fdbc89feeb0b15e5eaba31fa864e9f25c0ba652ed71883576687a8ea3309"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.349393 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"acde9c29-0910-40cc-9da8-06a566c67b4c","Type":"ContainerStarted","Data":"d983a03622063e773df62cb52258e6dd368f6464eae82c9f8dc22a7293952c52"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.349401 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"acde9c29-0910-40cc-9da8-06a566c67b4c","Type":"ContainerStarted","Data":"e0c4c7e3a25df5a81da9562316c7c3fec14cd31c124d917f3e442d0b0a986047"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.350613 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.352825 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="acde9c29-0910-40cc-9da8-06a566c67b4c" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.205:9322/\": dial tcp 10.217.0.205:9322: connect: connection refused"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.354874 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f","Type":"ContainerStarted","Data":"00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.354901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f","Type":"ContainerStarted","Data":"b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.366855 4717 generic.go:334] "Generic (PLEG): container finished" podID="413d3437-74ee-4793-9088-77fac53e4d7c" containerID="7e11b405b280a0e3dc60a90053e04e0d057e56cf6f6094e135d9311fe4f25982" exitCode=0
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.366905 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d06f-account-create-update-8vf24" event={"ID":"413d3437-74ee-4793-9088-77fac53e4d7c","Type":"ContainerDied","Data":"7e11b405b280a0e3dc60a90053e04e0d057e56cf6f6094e135d9311fe4f25982"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.391290 4717 generic.go:334] "Generic (PLEG): container finished" podID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerID="b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6" exitCode=0
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.391848 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a","Type":"ContainerDied","Data":"b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.391954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f29ab6b7-97ec-4d9d-ba67-d4abad06de9a","Type":"ContainerDied","Data":"b36ebddd35b1cbdbd3caf2444fbe06303ff15540d635bab9faab19b96f3aa94f"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.392042 4717 scope.go:117] "RemoveContainer" containerID="b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.392482 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.398889 4717 generic.go:334] "Generic (PLEG): container finished" podID="46ffbca0-f2a1-4c6b-8594-996db23783f2" containerID="a41a9e9098882e4e7f1fe94978bdef902eefc12293c186376c6466141d8b5ab8" exitCode=0
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.398970 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l" event={"ID":"46ffbca0-f2a1-4c6b-8594-996db23783f2","Type":"ContainerDied","Data":"a41a9e9098882e4e7f1fe94978bdef902eefc12293c186376c6466141d8b5ab8"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.410969 4717 generic.go:334] "Generic (PLEG): container finished" podID="2d25b2af-6a1d-4145-8627-5ba8338bcbef" containerID="18228d80281d2a60ffcf4c56d401c6cb7b582c8dabab1a6352f8c7e35f831ade" exitCode=0
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.411286 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30a0-account-create-update-7xzkx" event={"ID":"2d25b2af-6a1d-4145-8627-5ba8338bcbef","Type":"ContainerDied","Data":"18228d80281d2a60ffcf4c56d401c6cb7b582c8dabab1a6352f8c7e35f831ade"}
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.419681 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.442820 4717 scope.go:117] "RemoveContainer" containerID="11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.453152 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.453133299 podStartE2EDuration="2.453133299s" podCreationTimestamp="2026-03-08 05:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:17.393921474 +0000 UTC m=+1384.311570318" watchObservedRunningTime="2026-03-08 05:49:17.453133299 +0000 UTC m=+1384.370782143"
Mar 08 05:49:17 crc kubenswrapper[4717]: E0308 05:49:17.455908 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6345fc76_e42d_4a13_90d2_c2bd5135f073.slice/crio-conmon-75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319.scope\": RecentStats: unable to find data in memory cache]"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.501015 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.531516 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.548908 4717 scope.go:117] "RemoveContainer" containerID="b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.552784 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Mar 08 05:49:17 crc kubenswrapper[4717]: E0308 05:49:17.552865 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6\": container with ID starting with b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6 not found: ID does not exist" containerID="b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.552898 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6"} err="failed to get container status \"b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6\": rpc error: code = NotFound desc = could not find container \"b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6\": container with ID starting with b459b007df5d288808adf271d997a5f8167100bc7f72ad9ba95ceb9e7c1a45c6 not found: ID does not exist"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.552920 4717 scope.go:117] "RemoveContainer" containerID="11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006"
Mar 08 05:49:17 crc kubenswrapper[4717]: E0308 05:49:17.553199 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006\": container with ID starting with 11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006 not found: ID does not exist" containerID="11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.553235 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006"} err="failed to get container status \"11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006\": rpc error: code = NotFound desc = could not find container \"11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006\": container with ID starting with 11b9ef40b1cd0baacc194b52d9b0c7ef5b7b6fd380054680b40711787dbc2006 not found: ID does not exist"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.562508 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 08 05:49:17 crc kubenswrapper[4717]: E0308 05:49:17.562950 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.562965 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine"
Mar 08 05:49:17 crc kubenswrapper[4717]: E0308 05:49:17.562977 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.562983 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine"
Mar 08 05:49:17 crc kubenswrapper[4717]: E0308 05:49:17.563017 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.563023 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.563205 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.563215 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.563856 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.568566 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.574398 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.680964 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbsln\" (UniqueName: \"kubernetes.io/projected/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-kube-api-access-sbsln\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.681261 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.681343 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.681406 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-logs\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.681433 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.761719 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lwr4v"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.787988 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.788711 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-logs\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.788803 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.788930 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbsln\" (UniqueName: \"kubernetes.io/projected/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-kube-api-access-sbsln\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.789002 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.790544 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-logs\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.796108 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.796963 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.803288 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.810904 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbsln\" (UniqueName: \"kubernetes.io/projected/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-kube-api-access-sbsln\") pod \"watcher-decision-engine-0\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.820621 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6345fc76-e42d-4a13-90d2-c2bd5135f073" path="/var/lib/kubelet/pods/6345fc76-e42d-4a13-90d2-c2bd5135f073/volumes"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.821358 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be711ba-e0dc-4d84-a9d3-910819cc02e3" path="/var/lib/kubelet/pods/6be711ba-e0dc-4d84-a9d3-910819cc02e3/volumes"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.821912 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" path="/var/lib/kubelet/pods/f29ab6b7-97ec-4d9d-ba67-d4abad06de9a/volumes"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.890373 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwh76\" (UniqueName: \"kubernetes.io/projected/63ec8299-e288-40a1-882a-0980ef3b21d4-kube-api-access-vwh76\") pod \"63ec8299-e288-40a1-882a-0980ef3b21d4\" (UID: \"63ec8299-e288-40a1-882a-0980ef3b21d4\") "
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.890439 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ec8299-e288-40a1-882a-0980ef3b21d4-operator-scripts\") pod \"63ec8299-e288-40a1-882a-0980ef3b21d4\" (UID: \"63ec8299-e288-40a1-882a-0980ef3b21d4\") "
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.891323 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ec8299-e288-40a1-882a-0980ef3b21d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63ec8299-e288-40a1-882a-0980ef3b21d4" (UID: "63ec8299-e288-40a1-882a-0980ef3b21d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.901931 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ec8299-e288-40a1-882a-0980ef3b21d4-kube-api-access-vwh76" (OuterVolumeSpecName: "kube-api-access-vwh76") pod "63ec8299-e288-40a1-882a-0980ef3b21d4" (UID: "63ec8299-e288-40a1-882a-0980ef3b21d4"). InnerVolumeSpecName "kube-api-access-vwh76". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.904595 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.993008 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwh76\" (UniqueName: \"kubernetes.io/projected/63ec8299-e288-40a1-882a-0980ef3b21d4-kube-api-access-vwh76\") on node \"crc\" DevicePath \"\""
Mar 08 05:49:17 crc kubenswrapper[4717]: I0308 05:49:17.993032 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ec8299-e288-40a1-882a-0980ef3b21d4-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 05:49:18 crc kubenswrapper[4717]: I0308 05:49:18.136826 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-78c779987c-vpp7g" podUID="3b21c262-66aa-47df-ad60-24b7a43031a3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.179:9696/\": dial tcp 10.217.0.179:9696: i/o timeout"
Mar 08 05:49:18 crc kubenswrapper[4717]: I0308 05:49:18.440572 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 08 05:49:18 crc kubenswrapper[4717]: I0308 05:49:18.468547 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"58162927-f626-43d8-a792-507cf584db78","Type":"ContainerStarted","Data":"34e7e2e748b3489555f1d4ec019c5487ac980c6b6ec889aeeb523472a895b0b5"}
Mar 08 05:49:18 crc kubenswrapper[4717]: I0308 05:49:18.468599 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"58162927-f626-43d8-a792-507cf584db78","Type":"ContainerStarted","Data":"418119bb9ee97da37cf7dd206e4ad23a33e660e02aa2a1751fa5a912390716b1"}
Mar 08 05:49:18 crc kubenswrapper[4717]: I0308 05:49:18.474067 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ffc0380-502c-48b0-b36a-8421c5503fde","Type":"ContainerStarted","Data":"e8da7b4da2d93946cdbd7b0391070669e8f646b42356071cd9f73e6d128b100d"}
Mar 08 05:49:18 crc kubenswrapper[4717]: I0308 05:49:18.474112 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ffc0380-502c-48b0-b36a-8421c5503fde","Type":"ContainerStarted","Data":"dea86b256a31a7a12579b4980ec05eb08b8a552843f197f476344d0c0ee0a4a4"}
Mar 08 05:49:18 crc kubenswrapper[4717]: I0308 05:49:18.477768 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lwr4v"
Mar 08 05:49:18 crc kubenswrapper[4717]: I0308 05:49:18.477881 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lwr4v" event={"ID":"63ec8299-e288-40a1-882a-0980ef3b21d4","Type":"ContainerDied","Data":"3db5bbb02e6fc932643ecaf292c3f713aa79d4866763d07f6c38ad27f5764d4d"}
Mar 08 05:49:18 crc kubenswrapper[4717]: I0308 05:49:18.477916 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3db5bbb02e6fc932643ecaf292c3f713aa79d4866763d07f6c38ad27f5764d4d"
Mar 08 05:49:18 crc kubenswrapper[4717]: I0308 05:49:18.486863 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.486846034 podStartE2EDuration="2.486846034s" podCreationTimestamp="2026-03-08 05:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:18.486637019 +0000 UTC m=+1385.404285863" watchObservedRunningTime="2026-03-08 05:49:18.486846034 +0000 UTC m=+1385.404494878"
Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.000654 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d06f-account-create-update-8vf24"
Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.118902 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413d3437-74ee-4793-9088-77fac53e4d7c-operator-scripts\") pod \"413d3437-74ee-4793-9088-77fac53e4d7c\" (UID: \"413d3437-74ee-4793-9088-77fac53e4d7c\") "
Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.119249 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crhfg\" (UniqueName: \"kubernetes.io/projected/413d3437-74ee-4793-9088-77fac53e4d7c-kube-api-access-crhfg\") pod \"413d3437-74ee-4793-9088-77fac53e4d7c\" (UID: \"413d3437-74ee-4793-9088-77fac53e4d7c\") "
Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.120304 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/413d3437-74ee-4793-9088-77fac53e4d7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "413d3437-74ee-4793-9088-77fac53e4d7c" (UID: "413d3437-74ee-4793-9088-77fac53e4d7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.135881 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/413d3437-74ee-4793-9088-77fac53e4d7c-kube-api-access-crhfg" (OuterVolumeSpecName: "kube-api-access-crhfg") pod "413d3437-74ee-4793-9088-77fac53e4d7c" (UID: "413d3437-74ee-4793-9088-77fac53e4d7c"). InnerVolumeSpecName "kube-api-access-crhfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.221975 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413d3437-74ee-4793-9088-77fac53e4d7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.222009 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crhfg\" (UniqueName: \"kubernetes.io/projected/413d3437-74ee-4793-9088-77fac53e4d7c-kube-api-access-crhfg\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.304277 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jpsxr" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.336426 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.349046 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwbxz" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.361515 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-30a0-account-create-update-7xzkx" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.430559 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d25b2af-6a1d-4145-8627-5ba8338bcbef-operator-scripts\") pod \"2d25b2af-6a1d-4145-8627-5ba8338bcbef\" (UID: \"2d25b2af-6a1d-4145-8627-5ba8338bcbef\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.430615 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44hlt\" (UniqueName: \"kubernetes.io/projected/46ffbca0-f2a1-4c6b-8594-996db23783f2-kube-api-access-44hlt\") pod \"46ffbca0-f2a1-4c6b-8594-996db23783f2\" (UID: \"46ffbca0-f2a1-4c6b-8594-996db23783f2\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.430709 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdxw4\" (UniqueName: \"kubernetes.io/projected/e5702b49-8d23-42f8-a162-6783a3eb4a53-kube-api-access-pdxw4\") pod \"e5702b49-8d23-42f8-a162-6783a3eb4a53\" (UID: \"e5702b49-8d23-42f8-a162-6783a3eb4a53\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.430745 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ffbca0-f2a1-4c6b-8594-996db23783f2-operator-scripts\") pod \"46ffbca0-f2a1-4c6b-8594-996db23783f2\" (UID: \"46ffbca0-f2a1-4c6b-8594-996db23783f2\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.430851 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgxlt\" (UniqueName: \"kubernetes.io/projected/2d25b2af-6a1d-4145-8627-5ba8338bcbef-kube-api-access-xgxlt\") pod \"2d25b2af-6a1d-4145-8627-5ba8338bcbef\" (UID: \"2d25b2af-6a1d-4145-8627-5ba8338bcbef\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.430913 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5702b49-8d23-42f8-a162-6783a3eb4a53-operator-scripts\") pod \"e5702b49-8d23-42f8-a162-6783a3eb4a53\" (UID: \"e5702b49-8d23-42f8-a162-6783a3eb4a53\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.430936 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75976964-499c-4d15-937e-4921f1b16150-operator-scripts\") pod \"75976964-499c-4d15-937e-4921f1b16150\" (UID: \"75976964-499c-4d15-937e-4921f1b16150\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.431010 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l75g\" (UniqueName: \"kubernetes.io/projected/75976964-499c-4d15-937e-4921f1b16150-kube-api-access-7l75g\") pod \"75976964-499c-4d15-937e-4921f1b16150\" (UID: \"75976964-499c-4d15-937e-4921f1b16150\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.433082 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46ffbca0-f2a1-4c6b-8594-996db23783f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46ffbca0-f2a1-4c6b-8594-996db23783f2" (UID: "46ffbca0-f2a1-4c6b-8594-996db23783f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.433476 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d25b2af-6a1d-4145-8627-5ba8338bcbef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d25b2af-6a1d-4145-8627-5ba8338bcbef" (UID: "2d25b2af-6a1d-4145-8627-5ba8338bcbef"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.438151 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5702b49-8d23-42f8-a162-6783a3eb4a53-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5702b49-8d23-42f8-a162-6783a3eb4a53" (UID: "e5702b49-8d23-42f8-a162-6783a3eb4a53"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.440602 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75976964-499c-4d15-937e-4921f1b16150-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75976964-499c-4d15-937e-4921f1b16150" (UID: "75976964-499c-4d15-937e-4921f1b16150"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.460209 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75976964-499c-4d15-937e-4921f1b16150-kube-api-access-7l75g" (OuterVolumeSpecName: "kube-api-access-7l75g") pod "75976964-499c-4d15-937e-4921f1b16150" (UID: "75976964-499c-4d15-937e-4921f1b16150"). InnerVolumeSpecName "kube-api-access-7l75g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.502907 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ffbca0-f2a1-4c6b-8594-996db23783f2-kube-api-access-44hlt" (OuterVolumeSpecName: "kube-api-access-44hlt") pod "46ffbca0-f2a1-4c6b-8594-996db23783f2" (UID: "46ffbca0-f2a1-4c6b-8594-996db23783f2"). InnerVolumeSpecName "kube-api-access-44hlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.503887 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jpsxr" event={"ID":"e5702b49-8d23-42f8-a162-6783a3eb4a53","Type":"ContainerDied","Data":"c444d6d4472469a4f92c47d131dbefc0a8e7ca0b25e6346ef7ae2e56784e9112"} Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.503931 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c444d6d4472469a4f92c47d131dbefc0a8e7ca0b25e6346ef7ae2e56784e9112" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.504007 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jpsxr" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.504227 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5702b49-8d23-42f8-a162-6783a3eb4a53-kube-api-access-pdxw4" (OuterVolumeSpecName: "kube-api-access-pdxw4") pod "e5702b49-8d23-42f8-a162-6783a3eb4a53" (UID: "e5702b49-8d23-42f8-a162-6783a3eb4a53"). InnerVolumeSpecName "kube-api-access-pdxw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.504281 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d25b2af-6a1d-4145-8627-5ba8338bcbef-kube-api-access-xgxlt" (OuterVolumeSpecName: "kube-api-access-xgxlt") pod "2d25b2af-6a1d-4145-8627-5ba8338bcbef" (UID: "2d25b2af-6a1d-4145-8627-5ba8338bcbef"). InnerVolumeSpecName "kube-api-access-xgxlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.510664 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d06f-account-create-update-8vf24" event={"ID":"413d3437-74ee-4793-9088-77fac53e4d7c","Type":"ContainerDied","Data":"6bf45a7ea2beac4d4b3cf71f3de1bf582bbca2f105d2937cd8c1bfdd9777c745"} Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.510717 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bf45a7ea2beac4d4b3cf71f3de1bf582bbca2f105d2937cd8c1bfdd9777c745" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.510785 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d06f-account-create-update-8vf24" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.514038 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l" event={"ID":"46ffbca0-f2a1-4c6b-8594-996db23783f2","Type":"ContainerDied","Data":"8e9cec116610b1b4e3fa6e7f9900c55869ecc326580a6b29a48cfae0ebda41f9"} Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.514077 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e9cec116610b1b4e3fa6e7f9900c55869ecc326580a6b29a48cfae0ebda41f9" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.514153 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ef6d-account-create-update-zrg6l" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.520702 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30a0-account-create-update-7xzkx" event={"ID":"2d25b2af-6a1d-4145-8627-5ba8338bcbef","Type":"ContainerDied","Data":"8b10fdc4fded854bf6bc11e3bb3a2e70b661154b9e0668806e0ee4d46c966904"} Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.520737 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b10fdc4fded854bf6bc11e3bb3a2e70b661154b9e0668806e0ee4d46c966904" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.520813 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30a0-account-create-update-7xzkx" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.533425 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l75g\" (UniqueName: \"kubernetes.io/projected/75976964-499c-4d15-937e-4921f1b16150-kube-api-access-7l75g\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.533448 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d25b2af-6a1d-4145-8627-5ba8338bcbef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.533458 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44hlt\" (UniqueName: \"kubernetes.io/projected/46ffbca0-f2a1-4c6b-8594-996db23783f2-kube-api-access-44hlt\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.533466 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdxw4\" (UniqueName: \"kubernetes.io/projected/e5702b49-8d23-42f8-a162-6783a3eb4a53-kube-api-access-pdxw4\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 
05:49:19.533475 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46ffbca0-f2a1-4c6b-8594-996db23783f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.533521 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgxlt\" (UniqueName: \"kubernetes.io/projected/2d25b2af-6a1d-4145-8627-5ba8338bcbef-kube-api-access-xgxlt\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.533529 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5702b49-8d23-42f8-a162-6783a3eb4a53-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.533538 4717 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75976964-499c-4d15-937e-4921f1b16150-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.564443 4717 generic.go:334] "Generic (PLEG): container finished" podID="f9407de8-78ab-4bf1-9f53-49e71656898a" containerID="b6b164a88de189fae0dae7303712cb2904ac004f920d737bdf1c3d765a634eca" exitCode=0 Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.564546 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd6c6f4-sgbls" event={"ID":"f9407de8-78ab-4bf1-9f53-49e71656898a","Type":"ContainerDied","Data":"b6b164a88de189fae0dae7303712cb2904ac004f920d737bdf1c3d765a634eca"} Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.574288 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d6b6787-6f7e-42d4-ac9b-6804c46381b1","Type":"ContainerStarted","Data":"8a68d3cf3d2c08d1afd55e3dddbc8f3a914b835c9c186478556ad8d7bc0ab84c"} Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.574331 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d6b6787-6f7e-42d4-ac9b-6804c46381b1","Type":"ContainerStarted","Data":"0b2a7c718070addb5baa3d4a50f0619d7078c0ecebb4354af3a072918179472f"} Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.597896 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.5978802009999997 podStartE2EDuration="2.597880201s" podCreationTimestamp="2026-03-08 05:49:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:19.597221385 +0000 UTC m=+1386.514870239" watchObservedRunningTime="2026-03-08 05:49:19.597880201 +0000 UTC m=+1386.515529045" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.618963 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wwbxz" event={"ID":"75976964-499c-4d15-937e-4921f1b16150","Type":"ContainerDied","Data":"12ea192639a4903fc0696b6cfb8681539d6c096326c9cd5181714dd66fbc0a67"} Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.619020 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12ea192639a4903fc0696b6cfb8681539d6c096326c9cd5181714dd66fbc0a67" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.619051 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wwbxz" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.881520 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.950412 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-config\") pod \"f9407de8-78ab-4bf1-9f53-49e71656898a\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.950468 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zjwp\" (UniqueName: \"kubernetes.io/projected/f9407de8-78ab-4bf1-9f53-49e71656898a-kube-api-access-6zjwp\") pod \"f9407de8-78ab-4bf1-9f53-49e71656898a\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.950499 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-httpd-config\") pod \"f9407de8-78ab-4bf1-9f53-49e71656898a\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.950648 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-ovndb-tls-certs\") pod \"f9407de8-78ab-4bf1-9f53-49e71656898a\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.950752 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-combined-ca-bundle\") pod \"f9407de8-78ab-4bf1-9f53-49e71656898a\" (UID: \"f9407de8-78ab-4bf1-9f53-49e71656898a\") " Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.975142 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f9407de8-78ab-4bf1-9f53-49e71656898a-kube-api-access-6zjwp" (OuterVolumeSpecName: "kube-api-access-6zjwp") pod "f9407de8-78ab-4bf1-9f53-49e71656898a" (UID: "f9407de8-78ab-4bf1-9f53-49e71656898a"). InnerVolumeSpecName "kube-api-access-6zjwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:19 crc kubenswrapper[4717]: I0308 05:49:19.976279 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f9407de8-78ab-4bf1-9f53-49e71656898a" (UID: "f9407de8-78ab-4bf1-9f53-49e71656898a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.047291 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-config" (OuterVolumeSpecName: "config") pod "f9407de8-78ab-4bf1-9f53-49e71656898a" (UID: "f9407de8-78ab-4bf1-9f53-49e71656898a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.052703 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.052736 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zjwp\" (UniqueName: \"kubernetes.io/projected/f9407de8-78ab-4bf1-9f53-49e71656898a-kube-api-access-6zjwp\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.052749 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.087803 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9407de8-78ab-4bf1-9f53-49e71656898a" (UID: "f9407de8-78ab-4bf1-9f53-49e71656898a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.100759 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f9407de8-78ab-4bf1-9f53-49e71656898a" (UID: "f9407de8-78ab-4bf1-9f53-49e71656898a"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.155080 4717 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.155286 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9407de8-78ab-4bf1-9f53-49e71656898a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.629773 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ffc0380-502c-48b0-b36a-8421c5503fde","Type":"ContainerStarted","Data":"57d9833aa8800f94ac2d4e10f35506c04bc36baf418ab4ce64ab82fdae405f9b"} Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.632504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f","Type":"ContainerStarted","Data":"e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd"} Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.632634 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="ceilometer-central-agent" containerID="cri-o://8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38" gracePeriod=30 Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.632666 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="sg-core" containerID="cri-o://00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89" gracePeriod=30 Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.632717 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="proxy-httpd" containerID="cri-o://e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd" gracePeriod=30 Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.632656 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.632763 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="ceilometer-notification-agent" containerID="cri-o://b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096" gracePeriod=30 Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.639255 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.639327 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd6c6f4-sgbls" event={"ID":"f9407de8-78ab-4bf1-9f53-49e71656898a","Type":"ContainerDied","Data":"0f6b2f069f6cf7065d1f8c91351da964b1870e6dad2b1c91239a0be75c31e9de"} Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.639361 4717 scope.go:117] "RemoveContainer" containerID="3e62a802a83a0fd7cc70d85b5b7b1816de1c84f06b89d893888a3028192bccb3" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.639367 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.639546 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6fd6c6f4-sgbls" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.667149 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.667134195 podStartE2EDuration="4.667134195s" podCreationTimestamp="2026-03-08 05:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:20.663661429 +0000 UTC m=+1387.581310263" watchObservedRunningTime="2026-03-08 05:49:20.667134195 +0000 UTC m=+1387.584783039" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.675818 4717 scope.go:117] "RemoveContainer" containerID="b6b164a88de189fae0dae7303712cb2904ac004f920d737bdf1c3d765a634eca" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.709124 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6fd6c6f4-sgbls"] Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.722414 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.723992 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6c49bc6878-t8tg8" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.725614 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6fd6c6f4-sgbls"] Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.729147 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=9.776629892 podStartE2EDuration="14.729130049s" podCreationTimestamp="2026-03-08 05:49:06 +0000 UTC" firstStartedPulling="2026-03-08 05:49:14.133834439 +0000 UTC m=+1381.051483283" lastFinishedPulling="2026-03-08 05:49:19.086334596 +0000 UTC m=+1386.003983440" observedRunningTime="2026-03-08 05:49:20.704048028 +0000 UTC 
m=+1387.621696872" watchObservedRunningTime="2026-03-08 05:49:20.729130049 +0000 UTC m=+1387.646778893" Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.827318 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7755f67488-mclxw"] Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.827574 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7755f67488-mclxw" podUID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" containerName="placement-log" containerID="cri-o://e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6" gracePeriod=30 Mar 08 05:49:20 crc kubenswrapper[4717]: I0308 05:49:20.828039 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7755f67488-mclxw" podUID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" containerName="placement-api" containerID="cri-o://32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f" gracePeriod=30 Mar 08 05:49:21 crc kubenswrapper[4717]: I0308 05:49:21.649611 4717 generic.go:334] "Generic (PLEG): container finished" podID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" containerID="e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6" exitCode=143 Mar 08 05:49:21 crc kubenswrapper[4717]: I0308 05:49:21.649709 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7755f67488-mclxw" event={"ID":"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7","Type":"ContainerDied","Data":"e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6"} Mar 08 05:49:21 crc kubenswrapper[4717]: I0308 05:49:21.652851 4717 generic.go:334] "Generic (PLEG): container finished" podID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerID="e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd" exitCode=0 Mar 08 05:49:21 crc kubenswrapper[4717]: I0308 05:49:21.652872 4717 generic.go:334] "Generic (PLEG): container finished" podID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" 
containerID="00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89" exitCode=2 Mar 08 05:49:21 crc kubenswrapper[4717]: I0308 05:49:21.652879 4717 generic.go:334] "Generic (PLEG): container finished" podID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerID="b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096" exitCode=0 Mar 08 05:49:21 crc kubenswrapper[4717]: I0308 05:49:21.652916 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f","Type":"ContainerDied","Data":"e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd"} Mar 08 05:49:21 crc kubenswrapper[4717]: I0308 05:49:21.652940 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f","Type":"ContainerDied","Data":"00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89"} Mar 08 05:49:21 crc kubenswrapper[4717]: I0308 05:49:21.652952 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f","Type":"ContainerDied","Data":"b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096"} Mar 08 05:49:21 crc kubenswrapper[4717]: I0308 05:49:21.798075 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9407de8-78ab-4bf1-9f53-49e71656898a" path="/var/lib/kubelet/pods/f9407de8-78ab-4bf1-9f53-49e71656898a/volumes" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.029178 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.133045 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7755f67488-mclxw" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.202481 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-public-tls-certs\") pod \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.202555 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psf4r\" (UniqueName: \"kubernetes.io/projected/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-kube-api-access-psf4r\") pod \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.202610 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-internal-tls-certs\") pod \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.202716 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-combined-ca-bundle\") pod \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.202744 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-scripts\") pod \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.202785 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-logs\") pod \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.202866 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-config-data\") pod \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\" (UID: \"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7\") " Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.210654 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-kube-api-access-psf4r" (OuterVolumeSpecName: "kube-api-access-psf4r") pod "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" (UID: "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7"). InnerVolumeSpecName "kube-api-access-psf4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.213352 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.219595 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-logs" (OuterVolumeSpecName: "logs") pod "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" (UID: "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.227882 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-scripts" (OuterVolumeSpecName: "scripts") pod "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" (UID: "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.300830 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-config-data" (OuterVolumeSpecName: "config-data") pod "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" (UID: "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.305589 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psf4r\" (UniqueName: \"kubernetes.io/projected/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-kube-api-access-psf4r\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.305620 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.305630 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.305640 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.327860 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" (UID: "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.362833 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" (UID: "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.362991 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" (UID: "6ea480f2-2a72-4d1b-aa35-3b4b25b972b7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.406995 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.407029 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.407038 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.471705 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.471947 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="23b12879-34d7-47df-a056-6caccf7dec10" containerName="glance-log" containerID="cri-o://c1917f2759c86ee9721919a30a060871204b566f474f611a312fb765bec38dd4" gracePeriod=30 Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.472026 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="23b12879-34d7-47df-a056-6caccf7dec10" containerName="glance-httpd" containerID="cri-o://7f16a619842332821727a9d6674b333242122ffe323c117935455e4b11b303a9" gracePeriod=30 Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.668006 4717 generic.go:334] "Generic (PLEG): container finished" podID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" containerID="32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f" exitCode=0 Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.668124 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7755f67488-mclxw" event={"ID":"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7","Type":"ContainerDied","Data":"32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f"} Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.668187 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7755f67488-mclxw" event={"ID":"6ea480f2-2a72-4d1b-aa35-3b4b25b972b7","Type":"ContainerDied","Data":"6de8c89c19544c253cd53840e8f6e5ed0b5a42c35f1ea539971f1d57e8f856b4"} Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.668207 4717 scope.go:117] "RemoveContainer" containerID="32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.669502 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7755f67488-mclxw" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.671623 4717 generic.go:334] "Generic (PLEG): container finished" podID="23b12879-34d7-47df-a056-6caccf7dec10" containerID="c1917f2759c86ee9721919a30a060871204b566f474f611a312fb765bec38dd4" exitCode=143 Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.671663 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"23b12879-34d7-47df-a056-6caccf7dec10","Type":"ContainerDied","Data":"c1917f2759c86ee9721919a30a060871204b566f474f611a312fb765bec38dd4"} Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.703594 4717 scope.go:117] "RemoveContainer" containerID="e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.713201 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7755f67488-mclxw"] Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.725257 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7755f67488-mclxw"] Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.728028 4717 scope.go:117] "RemoveContainer" containerID="32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f" Mar 08 05:49:22 crc kubenswrapper[4717]: E0308 05:49:22.728421 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f\": container with ID starting with 32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f not found: ID does not exist" containerID="32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.728451 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f"} err="failed to get container status \"32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f\": rpc error: code = NotFound desc = could not find container \"32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f\": container with ID starting with 32ed083e15fce3eb464c94b222ba17884f39d63c59b08cb61a48446445261f1f not found: ID does not exist" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.728470 4717 scope.go:117] "RemoveContainer" containerID="e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6" Mar 08 05:49:22 crc kubenswrapper[4717]: E0308 05:49:22.730054 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6\": container with ID starting with e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6 not found: ID does not exist" containerID="e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6" Mar 08 05:49:22 crc kubenswrapper[4717]: I0308 05:49:22.730965 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6"} err="failed to get container status \"e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6\": rpc error: code = NotFound desc = could not find container \"e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6\": container with ID starting with e0c56b8d82fe47253d24f5a10f6d043563aa352443f061dac954f85043c86cf6 not found: ID does not exist" Mar 08 05:49:23 crc kubenswrapper[4717]: I0308 05:49:23.689085 4717 generic.go:334] "Generic (PLEG): container finished" podID="23b12879-34d7-47df-a056-6caccf7dec10" containerID="7f16a619842332821727a9d6674b333242122ffe323c117935455e4b11b303a9" exitCode=0 Mar 08 05:49:23 crc kubenswrapper[4717]: 
I0308 05:49:23.689263 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"23b12879-34d7-47df-a056-6caccf7dec10","Type":"ContainerDied","Data":"7f16a619842332821727a9d6674b333242122ffe323c117935455e4b11b303a9"} Mar 08 05:49:23 crc kubenswrapper[4717]: I0308 05:49:23.803278 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" path="/var/lib/kubelet/pods/6ea480f2-2a72-4d1b-aa35-3b4b25b972b7/volumes" Mar 08 05:49:23 crc kubenswrapper[4717]: I0308 05:49:23.846974 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.035346 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-httpd-run\") pod \"23b12879-34d7-47df-a056-6caccf7dec10\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.035410 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-combined-ca-bundle\") pod \"23b12879-34d7-47df-a056-6caccf7dec10\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.035438 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-internal-tls-certs\") pod \"23b12879-34d7-47df-a056-6caccf7dec10\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.035483 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-config-data\") pod \"23b12879-34d7-47df-a056-6caccf7dec10\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.035524 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-scripts\") pod \"23b12879-34d7-47df-a056-6caccf7dec10\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.035549 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"23b12879-34d7-47df-a056-6caccf7dec10\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.035577 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-logs\") pod \"23b12879-34d7-47df-a056-6caccf7dec10\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.035653 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52s48\" (UniqueName: \"kubernetes.io/projected/23b12879-34d7-47df-a056-6caccf7dec10-kube-api-access-52s48\") pod \"23b12879-34d7-47df-a056-6caccf7dec10\" (UID: \"23b12879-34d7-47df-a056-6caccf7dec10\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.044542 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b12879-34d7-47df-a056-6caccf7dec10-kube-api-access-52s48" (OuterVolumeSpecName: "kube-api-access-52s48") pod "23b12879-34d7-47df-a056-6caccf7dec10" (UID: "23b12879-34d7-47df-a056-6caccf7dec10"). InnerVolumeSpecName "kube-api-access-52s48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.044859 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-logs" (OuterVolumeSpecName: "logs") pod "23b12879-34d7-47df-a056-6caccf7dec10" (UID: "23b12879-34d7-47df-a056-6caccf7dec10"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.045182 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "23b12879-34d7-47df-a056-6caccf7dec10" (UID: "23b12879-34d7-47df-a056-6caccf7dec10"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.076851 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "23b12879-34d7-47df-a056-6caccf7dec10" (UID: "23b12879-34d7-47df-a056-6caccf7dec10"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.110921 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-scripts" (OuterVolumeSpecName: "scripts") pod "23b12879-34d7-47df-a056-6caccf7dec10" (UID: "23b12879-34d7-47df-a056-6caccf7dec10"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.147079 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.147130 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.147141 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.147151 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52s48\" (UniqueName: \"kubernetes.io/projected/23b12879-34d7-47df-a056-6caccf7dec10-kube-api-access-52s48\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.147161 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23b12879-34d7-47df-a056-6caccf7dec10-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.173372 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23b12879-34d7-47df-a056-6caccf7dec10" (UID: "23b12879-34d7-47df-a056-6caccf7dec10"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.207508 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.214343 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "23b12879-34d7-47df-a056-6caccf7dec10" (UID: "23b12879-34d7-47df-a056-6caccf7dec10"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.215576 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-config-data" (OuterVolumeSpecName: "config-data") pod "23b12879-34d7-47df-a056-6caccf7dec10" (UID: "23b12879-34d7-47df-a056-6caccf7dec10"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.254991 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.255030 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.255044 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b12879-34d7-47df-a056-6caccf7dec10-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.255056 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.321104 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343221 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x6mmq"] Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343649 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9407de8-78ab-4bf1-9f53-49e71656898a" containerName="neutron-api" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343665 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9407de8-78ab-4bf1-9f53-49e71656898a" containerName="neutron-api" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343674 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ec8299-e288-40a1-882a-0980ef3b21d4" containerName="mariadb-database-create" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343720 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ec8299-e288-40a1-882a-0980ef3b21d4" containerName="mariadb-database-create" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343737 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9407de8-78ab-4bf1-9f53-49e71656898a" containerName="neutron-httpd" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343743 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9407de8-78ab-4bf1-9f53-49e71656898a" containerName="neutron-httpd" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343759 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" containerName="placement-log" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343764 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" containerName="placement-log" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343778 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75976964-499c-4d15-937e-4921f1b16150" 
containerName="mariadb-database-create" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343784 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="75976964-499c-4d15-937e-4921f1b16150" containerName="mariadb-database-create" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343800 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b12879-34d7-47df-a056-6caccf7dec10" containerName="glance-httpd" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343806 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b12879-34d7-47df-a056-6caccf7dec10" containerName="glance-httpd" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343816 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413d3437-74ee-4793-9088-77fac53e4d7c" containerName="mariadb-account-create-update" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343822 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="413d3437-74ee-4793-9088-77fac53e4d7c" containerName="mariadb-account-create-update" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343832 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ffbca0-f2a1-4c6b-8594-996db23783f2" containerName="mariadb-account-create-update" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343837 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ffbca0-f2a1-4c6b-8594-996db23783f2" containerName="mariadb-account-create-update" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343845 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343850 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343862 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="sg-core" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343867 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="sg-core" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343877 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d25b2af-6a1d-4145-8627-5ba8338bcbef" containerName="mariadb-account-create-update" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343883 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d25b2af-6a1d-4145-8627-5ba8338bcbef" containerName="mariadb-account-create-update" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343890 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5702b49-8d23-42f8-a162-6783a3eb4a53" containerName="mariadb-database-create" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343897 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5702b49-8d23-42f8-a162-6783a3eb4a53" containerName="mariadb-database-create" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343905 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="proxy-httpd" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343911 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="proxy-httpd" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343922 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="ceilometer-notification-agent" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343928 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="ceilometer-notification-agent" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 
05:49:24.343942 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b12879-34d7-47df-a056-6caccf7dec10" containerName="glance-log" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343948 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b12879-34d7-47df-a056-6caccf7dec10" containerName="glance-log" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343958 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="ceilometer-central-agent" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343963 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="ceilometer-central-agent" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.343975 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" containerName="placement-api" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.343981 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" containerName="placement-api" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344136 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5702b49-8d23-42f8-a162-6783a3eb4a53" containerName="mariadb-database-create" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344155 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="sg-core" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344162 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="ceilometer-notification-agent" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344170 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" containerName="placement-log" Mar 08 05:49:24 crc 
kubenswrapper[4717]: I0308 05:49:24.344177 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344183 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="75976964-499c-4d15-937e-4921f1b16150" containerName="mariadb-database-create" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344192 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29ab6b7-97ec-4d9d-ba67-d4abad06de9a" containerName="watcher-decision-engine" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344201 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="46ffbca0-f2a1-4c6b-8594-996db23783f2" containerName="mariadb-account-create-update" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344212 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="ceilometer-central-agent" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344220 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9407de8-78ab-4bf1-9f53-49e71656898a" containerName="neutron-httpd" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344228 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b12879-34d7-47df-a056-6caccf7dec10" containerName="glance-httpd" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344238 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea480f2-2a72-4d1b-aa35-3b4b25b972b7" containerName="placement-api" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344247 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ec8299-e288-40a1-882a-0980ef3b21d4" containerName="mariadb-database-create" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344259 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2d25b2af-6a1d-4145-8627-5ba8338bcbef" containerName="mariadb-account-create-update" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344268 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9407de8-78ab-4bf1-9f53-49e71656898a" containerName="neutron-api" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344278 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerName="proxy-httpd" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344286 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b12879-34d7-47df-a056-6caccf7dec10" containerName="glance-log" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344293 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="413d3437-74ee-4793-9088-77fac53e4d7c" containerName="mariadb-account-create-update" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.344972 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.347402 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2r54d" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.347642 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.347778 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.384582 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x6mmq"] Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.459508 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-scripts\") pod \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.459552 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-combined-ca-bundle\") pod \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.459627 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rwvj\" (UniqueName: \"kubernetes.io/projected/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-kube-api-access-7rwvj\") pod \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.459695 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-run-httpd\") pod \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.459774 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-log-httpd\") pod \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.459831 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-config-data\") pod \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.459866 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-sg-core-conf-yaml\") pod \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\" (UID: \"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f\") " Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.460200 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-config-data\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.460246 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z5sg\" (UniqueName: \"kubernetes.io/projected/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-kube-api-access-4z5sg\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: 
\"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.460315 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.461013 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-scripts\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.460331 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" (UID: "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.460738 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" (UID: "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.466074 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-scripts" (OuterVolumeSpecName: "scripts") pod "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" (UID: "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.466709 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-kube-api-access-7rwvj" (OuterVolumeSpecName: "kube-api-access-7rwvj") pod "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" (UID: "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f"). InnerVolumeSpecName "kube-api-access-7rwvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.498425 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" (UID: "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.562320 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.562385 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-scripts\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.562442 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-config-data\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.562477 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z5sg\" (UniqueName: \"kubernetes.io/projected/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-kube-api-access-4z5sg\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.562535 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.562546 4717 
reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.562555 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.562564 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rwvj\" (UniqueName: \"kubernetes.io/projected/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-kube-api-access-7rwvj\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.562572 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.576280 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-config-data\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.576288 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-scripts\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.576737 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.579192 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" (UID: "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.585139 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z5sg\" (UniqueName: \"kubernetes.io/projected/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-kube-api-access-4z5sg\") pod \"nova-cell0-conductor-db-sync-x6mmq\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.605572 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-config-data" (OuterVolumeSpecName: "config-data") pod "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" (UID: "2e05fd1b-de0d-4720-afc6-0f6fbd89e93f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.662400 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.664112 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.664135 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.707885 4717 generic.go:334] "Generic (PLEG): container finished" podID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" containerID="8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38" exitCode=0 Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.708393 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f","Type":"ContainerDied","Data":"8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38"} Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.708940 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e05fd1b-de0d-4720-afc6-0f6fbd89e93f","Type":"ContainerDied","Data":"077aa7c9feb4b99c76efa4485f286cda23e577f6515f0f7d1afea5bbcddb0e5b"} Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.708490 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.708994 4717 scope.go:117] "RemoveContainer" containerID="e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.716803 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"23b12879-34d7-47df-a056-6caccf7dec10","Type":"ContainerDied","Data":"290261224c0ce51d3ca223683b9a0c4fd3f00efb6e784f88a4489b513857d02c"} Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.716866 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.747968 4717 scope.go:117] "RemoveContainer" containerID="00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.750295 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.771095 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.782823 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.805616 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.819149 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.821546 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.823980 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.824751 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.854652 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.856241 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.859250 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.859419 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.872443 4717 scope.go:117] "RemoveContainer" containerID="b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.881114 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.911342 4717 scope.go:117] "RemoveContainer" containerID="8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.915330 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.965029 4717 scope.go:117] "RemoveContainer" containerID="e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 
05:49:24.965846 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd\": container with ID starting with e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd not found: ID does not exist" containerID="e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.965881 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd"} err="failed to get container status \"e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd\": rpc error: code = NotFound desc = could not find container \"e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd\": container with ID starting with e2ceecaf23a498e31fb7e3dd448615e249724603aac287ebbb8530cdb21359bd not found: ID does not exist" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.965904 4717 scope.go:117] "RemoveContainer" containerID="00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.966185 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89\": container with ID starting with 00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89 not found: ID does not exist" containerID="00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.966201 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89"} err="failed to get container status \"00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89\": rpc 
error: code = NotFound desc = could not find container \"00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89\": container with ID starting with 00100656bdd7cfed942d26488365bbb15d7531fe5b303f3e59736e2316043f89 not found: ID does not exist" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.966213 4717 scope.go:117] "RemoveContainer" containerID="b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.966389 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096\": container with ID starting with b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096 not found: ID does not exist" containerID="b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.966403 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096"} err="failed to get container status \"b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096\": rpc error: code = NotFound desc = could not find container \"b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096\": container with ID starting with b6773702dd9ae872d8920932eeeab7621918d0ac47b89e32211fc80108545096 not found: ID does not exist" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.966414 4717 scope.go:117] "RemoveContainer" containerID="8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38" Mar 08 05:49:24 crc kubenswrapper[4717]: E0308 05:49:24.966592 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38\": container with ID starting with 
8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38 not found: ID does not exist" containerID="8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.966608 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38"} err="failed to get container status \"8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38\": rpc error: code = NotFound desc = could not find container \"8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38\": container with ID starting with 8b2a94202e1d48bc13bbadbc8128b252c1ff1246df6ea61474a87159c7326c38 not found: ID does not exist" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.966621 4717 scope.go:117] "RemoveContainer" containerID="7f16a619842332821727a9d6674b333242122ffe323c117935455e4b11b303a9" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.989482 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-run-httpd\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.989575 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdvsf\" (UniqueName: \"kubernetes.io/projected/c715384f-21a1-490a-9432-1fef4658f5bd-kube-api-access-kdvsf\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.989608 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.989661 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-config-data\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.989707 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-log-httpd\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.990223 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.990281 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.990311 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c715384f-21a1-490a-9432-1fef4658f5bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.990427 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-scripts\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.990509 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqb94\" (UniqueName: \"kubernetes.io/projected/086af170-a3a2-4923-9340-e9595b37fcf6-kube-api-access-bqb94\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.990527 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c715384f-21a1-490a-9432-1fef4658f5bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.990699 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.990719 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.990745 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.990811 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:24 crc kubenswrapper[4717]: I0308 05:49:24.999171 4717 scope.go:117] "RemoveContainer" containerID="c1917f2759c86ee9721919a30a060871204b566f474f611a312fb765bec38dd4" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.093035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-config-data\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.093077 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-log-httpd\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.093566 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-log-httpd\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.093606 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.093641 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.093661 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c715384f-21a1-490a-9432-1fef4658f5bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094079 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-scripts\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094256 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c715384f-21a1-490a-9432-1fef4658f5bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094114 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqb94\" (UniqueName: \"kubernetes.io/projected/086af170-a3a2-4923-9340-e9595b37fcf6-kube-api-access-bqb94\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094419 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c715384f-21a1-490a-9432-1fef4658f5bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094533 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc 
kubenswrapper[4717]: I0308 05:49:25.094555 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-run-httpd\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094654 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c715384f-21a1-490a-9432-1fef4658f5bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094669 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdvsf\" (UniqueName: \"kubernetes.io/projected/c715384f-21a1-490a-9432-1fef4658f5bd-kube-api-access-kdvsf\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.094713 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.095039 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-run-httpd\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.095299 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.098185 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.098498 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.098573 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.099760 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.100807 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-scripts\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.104335 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-config-data\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.107043 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.108164 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c715384f-21a1-490a-9432-1fef4658f5bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.124260 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdvsf\" (UniqueName: \"kubernetes.io/projected/c715384f-21a1-490a-9432-1fef4658f5bd-kube-api-access-kdvsf\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.133472 4717 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:25 crc kubenswrapper[4717]: E0308 05:49:25.134178 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-bqb94], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="086af170-a3a2-4923-9340-e9595b37fcf6" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.136412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqb94\" (UniqueName: \"kubernetes.io/projected/086af170-a3a2-4923-9340-e9595b37fcf6-kube-api-access-bqb94\") pod \"ceilometer-0\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.152143 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c715384f-21a1-490a-9432-1fef4658f5bd\") " pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.180609 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.243090 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x6mmq"] Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.639730 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.654342 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.732826 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.741474 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x6mmq" event={"ID":"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4","Type":"ContainerStarted","Data":"631d46ece090d6c055b3b03af82712987fe4d442315659db81adfb5562d8a097"} Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.745805 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.753036 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.764831 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.808658 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b12879-34d7-47df-a056-6caccf7dec10" path="/var/lib/kubelet/pods/23b12879-34d7-47df-a056-6caccf7dec10/volumes" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.809496 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e05fd1b-de0d-4720-afc6-0f6fbd89e93f" path="/var/lib/kubelet/pods/2e05fd1b-de0d-4720-afc6-0f6fbd89e93f/volumes" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.912934 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-run-httpd\") pod \"086af170-a3a2-4923-9340-e9595b37fcf6\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.912992 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-config-data\") pod \"086af170-a3a2-4923-9340-e9595b37fcf6\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.913067 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-scripts\") pod \"086af170-a3a2-4923-9340-e9595b37fcf6\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.913086 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-sg-core-conf-yaml\") pod \"086af170-a3a2-4923-9340-e9595b37fcf6\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.913958 
4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-log-httpd\") pod \"086af170-a3a2-4923-9340-e9595b37fcf6\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.913986 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqb94\" (UniqueName: \"kubernetes.io/projected/086af170-a3a2-4923-9340-e9595b37fcf6-kube-api-access-bqb94\") pod \"086af170-a3a2-4923-9340-e9595b37fcf6\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.914011 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-combined-ca-bundle\") pod \"086af170-a3a2-4923-9340-e9595b37fcf6\" (UID: \"086af170-a3a2-4923-9340-e9595b37fcf6\") " Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.914233 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "086af170-a3a2-4923-9340-e9595b37fcf6" (UID: "086af170-a3a2-4923-9340-e9595b37fcf6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.914501 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "086af170-a3a2-4923-9340-e9595b37fcf6" (UID: "086af170-a3a2-4923-9340-e9595b37fcf6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.915322 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.915336 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/086af170-a3a2-4923-9340-e9595b37fcf6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.921422 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-config-data" (OuterVolumeSpecName: "config-data") pod "086af170-a3a2-4923-9340-e9595b37fcf6" (UID: "086af170-a3a2-4923-9340-e9595b37fcf6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.921606 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "086af170-a3a2-4923-9340-e9595b37fcf6" (UID: "086af170-a3a2-4923-9340-e9595b37fcf6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.923540 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "086af170-a3a2-4923-9340-e9595b37fcf6" (UID: "086af170-a3a2-4923-9340-e9595b37fcf6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.924305 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-scripts" (OuterVolumeSpecName: "scripts") pod "086af170-a3a2-4923-9340-e9595b37fcf6" (UID: "086af170-a3a2-4923-9340-e9595b37fcf6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:25 crc kubenswrapper[4717]: I0308 05:49:25.929046 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/086af170-a3a2-4923-9340-e9595b37fcf6-kube-api-access-bqb94" (OuterVolumeSpecName: "kube-api-access-bqb94") pod "086af170-a3a2-4923-9340-e9595b37fcf6" (UID: "086af170-a3a2-4923-9340-e9595b37fcf6"). InnerVolumeSpecName "kube-api-access-bqb94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.016782 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.017037 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.017111 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqb94\" (UniqueName: \"kubernetes.io/projected/086af170-a3a2-4923-9340-e9595b37fcf6-kube-api-access-bqb94\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.017170 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.017232 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/086af170-a3a2-4923-9340-e9595b37fcf6-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.628298 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.628592 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.691719 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.692278 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.760399 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.760666 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c715384f-21a1-490a-9432-1fef4658f5bd","Type":"ContainerStarted","Data":"c3c50e41f7f70e80be646ff13a5d11d3006046e036feed7385411fa6a6789d14"} Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.760716 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c715384f-21a1-490a-9432-1fef4658f5bd","Type":"ContainerStarted","Data":"ca2931d31507213e8396e0f6d0500dbd6bbc7aa3e18ccd8f71174916363dbf64"} Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.761613 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.761635 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.824480 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.831805 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.837822 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.840025 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.843327 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.843446 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.850804 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.934298 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-scripts\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.934337 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.934573 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-config-data\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.934747 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " 
pod="openstack/ceilometer-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.934773 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhrd9\" (UniqueName: \"kubernetes.io/projected/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-kube-api-access-hhrd9\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.934810 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-run-httpd\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:26 crc kubenswrapper[4717]: I0308 05:49:26.934967 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-log-httpd\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.029145 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.035943 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-log-httpd\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.036001 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-scripts\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " 
pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.036021 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.036072 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-config-data\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.036113 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhrd9\" (UniqueName: \"kubernetes.io/projected/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-kube-api-access-hhrd9\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.036128 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.036144 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-run-httpd\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.036562 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-run-httpd\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.037287 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-log-httpd\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.040240 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-scripts\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.040360 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-config-data\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.041467 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.041939 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.054866 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hhrd9\" (UniqueName: \"kubernetes.io/projected/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-kube-api-access-hhrd9\") pod \"ceilometer-0\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.060124 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.211826 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.676741 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:27 crc kubenswrapper[4717]: W0308 05:49:27.694291 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf1e1153_7406_40e6_bbf3_4d48e1f5055c.slice/crio-2afb48053ee31e2b4e21c994aca5d347ca949d7d1181f76b719d5931d8d872d7 WatchSource:0}: Error finding container 2afb48053ee31e2b4e21c994aca5d347ca949d7d1181f76b719d5931d8d872d7: Status 404 returned error can't find the container with id 2afb48053ee31e2b4e21c994aca5d347ca949d7d1181f76b719d5931d8d872d7 Mar 08 05:49:27 crc kubenswrapper[4717]: E0308 05:49:27.756300 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6345fc76_e42d_4a13_90d2_c2bd5135f073.slice/crio-conmon-75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319.scope\": RecentStats: unable to find data in memory cache]" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.797639 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="086af170-a3a2-4923-9340-e9595b37fcf6" path="/var/lib/kubelet/pods/086af170-a3a2-4923-9340-e9595b37fcf6/volumes" Mar 08 05:49:27 crc 
kubenswrapper[4717]: I0308 05:49:27.798208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf1e1153-7406-40e6-bbf3-4d48e1f5055c","Type":"ContainerStarted","Data":"2afb48053ee31e2b4e21c994aca5d347ca949d7d1181f76b719d5931d8d872d7"} Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.798244 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c715384f-21a1-490a-9432-1fef4658f5bd","Type":"ContainerStarted","Data":"531627e64a507aa98a93412e20f167c7ad458027c5cb778700b65b373c6e474b"} Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.814497 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.814478473 podStartE2EDuration="3.814478473s" podCreationTimestamp="2026-03-08 05:49:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:27.811879819 +0000 UTC m=+1394.729528673" watchObservedRunningTime="2026-03-08 05:49:27.814478473 +0000 UTC m=+1394.732127317" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.830721 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.906172 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:27 crc kubenswrapper[4717]: I0308 05:49:27.975540 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:28 crc kubenswrapper[4717]: I0308 05:49:28.689972 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 05:49:28 crc kubenswrapper[4717]: I0308 05:49:28.693904 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 05:49:28 crc kubenswrapper[4717]: I0308 05:49:28.810339 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf1e1153-7406-40e6-bbf3-4d48e1f5055c","Type":"ContainerStarted","Data":"27e4dc0f1597cc6f120537d22a1b9a6525bebb4e9da3de169766946a735211ed"} Mar 08 05:49:28 crc kubenswrapper[4717]: I0308 05:49:28.810837 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf1e1153-7406-40e6-bbf3-4d48e1f5055c","Type":"ContainerStarted","Data":"0add4af343d137d26183d105d00e179f0852cbbbac3c3042dda2cc8e80de3572"} Mar 08 05:49:28 crc kubenswrapper[4717]: I0308 05:49:28.811291 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:28 crc kubenswrapper[4717]: I0308 05:49:28.852929 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:29 crc kubenswrapper[4717]: I0308 05:49:29.822182 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf1e1153-7406-40e6-bbf3-4d48e1f5055c","Type":"ContainerStarted","Data":"ecaf4e2cf35a31084cc0f8600592fe6177c01a5d87d179c291045f3515b56c21"} Mar 08 05:49:31 crc kubenswrapper[4717]: I0308 05:49:31.861711 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf1e1153-7406-40e6-bbf3-4d48e1f5055c","Type":"ContainerStarted","Data":"728cf41f673161df30febf652dec117ca99cc0cdec84230f3bee2563104cdd7f"} Mar 08 05:49:31 crc kubenswrapper[4717]: I0308 05:49:31.862356 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 05:49:31 crc kubenswrapper[4717]: I0308 05:49:31.888085 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.741862117 
podStartE2EDuration="5.888066806s" podCreationTimestamp="2026-03-08 05:49:26 +0000 UTC" firstStartedPulling="2026-03-08 05:49:27.697060198 +0000 UTC m=+1394.614709042" lastFinishedPulling="2026-03-08 05:49:30.843264877 +0000 UTC m=+1397.760913731" observedRunningTime="2026-03-08 05:49:31.886843696 +0000 UTC m=+1398.804492560" watchObservedRunningTime="2026-03-08 05:49:31.888066806 +0000 UTC m=+1398.805715650" Mar 08 05:49:33 crc kubenswrapper[4717]: I0308 05:49:33.192029 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 08 05:49:33 crc kubenswrapper[4717]: I0308 05:49:33.192219 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="3d6b6787-6f7e-42d4-ac9b-6804c46381b1" containerName="watcher-decision-engine" containerID="cri-o://8a68d3cf3d2c08d1afd55e3dddbc8f3a914b835c9c186478556ad8d7bc0ab84c" gracePeriod=30 Mar 08 05:49:33 crc kubenswrapper[4717]: I0308 05:49:33.855150 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:33 crc kubenswrapper[4717]: I0308 05:49:33.883596 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="ceilometer-central-agent" containerID="cri-o://0add4af343d137d26183d105d00e179f0852cbbbac3c3042dda2cc8e80de3572" gracePeriod=30 Mar 08 05:49:33 crc kubenswrapper[4717]: I0308 05:49:33.883698 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="proxy-httpd" containerID="cri-o://728cf41f673161df30febf652dec117ca99cc0cdec84230f3bee2563104cdd7f" gracePeriod=30 Mar 08 05:49:33 crc kubenswrapper[4717]: I0308 05:49:33.883747 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="ceilometer-notification-agent" containerID="cri-o://27e4dc0f1597cc6f120537d22a1b9a6525bebb4e9da3de169766946a735211ed" gracePeriod=30 Mar 08 05:49:33 crc kubenswrapper[4717]: I0308 05:49:33.883770 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="sg-core" containerID="cri-o://ecaf4e2cf35a31084cc0f8600592fe6177c01a5d87d179c291045f3515b56c21" gracePeriod=30 Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.120070 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.120123 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.120162 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.120808 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fdc38828b70d25a0ccd54dcdac75ca0eebf8f58cb86023b869d6450eb8241d7e"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.120861 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://fdc38828b70d25a0ccd54dcdac75ca0eebf8f58cb86023b869d6450eb8241d7e" gracePeriod=600 Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.920220 4717 generic.go:334] "Generic (PLEG): container finished" podID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerID="728cf41f673161df30febf652dec117ca99cc0cdec84230f3bee2563104cdd7f" exitCode=0 Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.920487 4717 generic.go:334] "Generic (PLEG): container finished" podID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerID="ecaf4e2cf35a31084cc0f8600592fe6177c01a5d87d179c291045f3515b56c21" exitCode=2 Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.920496 4717 generic.go:334] "Generic (PLEG): container finished" podID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerID="27e4dc0f1597cc6f120537d22a1b9a6525bebb4e9da3de169766946a735211ed" exitCode=0 Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.920503 4717 generic.go:334] "Generic (PLEG): container finished" podID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerID="0add4af343d137d26183d105d00e179f0852cbbbac3c3042dda2cc8e80de3572" exitCode=0 Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.920306 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf1e1153-7406-40e6-bbf3-4d48e1f5055c","Type":"ContainerDied","Data":"728cf41f673161df30febf652dec117ca99cc0cdec84230f3bee2563104cdd7f"} Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.920587 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf1e1153-7406-40e6-bbf3-4d48e1f5055c","Type":"ContainerDied","Data":"ecaf4e2cf35a31084cc0f8600592fe6177c01a5d87d179c291045f3515b56c21"} Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 
05:49:34.920605 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf1e1153-7406-40e6-bbf3-4d48e1f5055c","Type":"ContainerDied","Data":"27e4dc0f1597cc6f120537d22a1b9a6525bebb4e9da3de169766946a735211ed"} Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.920615 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf1e1153-7406-40e6-bbf3-4d48e1f5055c","Type":"ContainerDied","Data":"0add4af343d137d26183d105d00e179f0852cbbbac3c3042dda2cc8e80de3572"} Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.923803 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="fdc38828b70d25a0ccd54dcdac75ca0eebf8f58cb86023b869d6450eb8241d7e" exitCode=0 Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.923846 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"fdc38828b70d25a0ccd54dcdac75ca0eebf8f58cb86023b869d6450eb8241d7e"} Mar 08 05:49:34 crc kubenswrapper[4717]: I0308 05:49:34.923881 4717 scope.go:117] "RemoveContainer" containerID="14c69fb7e16b1586e83c7b94c4423a6de420e911261ae096ef8585ebcd99c77b" Mar 08 05:49:35 crc kubenswrapper[4717]: I0308 05:49:35.180932 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:35 crc kubenswrapper[4717]: I0308 05:49:35.180987 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:35 crc kubenswrapper[4717]: I0308 05:49:35.222226 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:35 crc kubenswrapper[4717]: I0308 05:49:35.224707 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:35 crc kubenswrapper[4717]: I0308 05:49:35.935246 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:35 crc kubenswrapper[4717]: I0308 05:49:35.935291 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:37 crc kubenswrapper[4717]: I0308 05:49:37.714014 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:37 crc kubenswrapper[4717]: I0308 05:49:37.716055 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 05:49:37 crc kubenswrapper[4717]: I0308 05:49:37.955760 4717 generic.go:334] "Generic (PLEG): container finished" podID="3d6b6787-6f7e-42d4-ac9b-6804c46381b1" containerID="8a68d3cf3d2c08d1afd55e3dddbc8f3a914b835c9c186478556ad8d7bc0ab84c" exitCode=0 Mar 08 05:49:37 crc kubenswrapper[4717]: I0308 05:49:37.956005 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d6b6787-6f7e-42d4-ac9b-6804c46381b1","Type":"ContainerDied","Data":"8a68d3cf3d2c08d1afd55e3dddbc8f3a914b835c9c186478556ad8d7bc0ab84c"} Mar 08 05:49:37 crc kubenswrapper[4717]: E0308 05:49:37.974809 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6345fc76_e42d_4a13_90d2_c2bd5135f073.slice/crio-conmon-75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319.scope\": RecentStats: unable to find data in memory cache]" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.535001 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.545769 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.678497 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-scripts\") pod \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.678540 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-logs\") pod \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.678562 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-run-httpd\") pod \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.678637 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-config-data\") pod \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.678756 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhrd9\" (UniqueName: \"kubernetes.io/projected/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-kube-api-access-hhrd9\") pod \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " Mar 08 05:49:38 crc 
kubenswrapper[4717]: I0308 05:49:38.678815 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbsln\" (UniqueName: \"kubernetes.io/projected/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-kube-api-access-sbsln\") pod \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.678936 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-config-data\") pod \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.678969 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-sg-core-conf-yaml\") pod \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.679022 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-combined-ca-bundle\") pod \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.679047 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-log-httpd\") pod \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\" (UID: \"cf1e1153-7406-40e6-bbf3-4d48e1f5055c\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.679064 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-custom-prometheus-ca\") pod \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.679081 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-combined-ca-bundle\") pod \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\" (UID: \"3d6b6787-6f7e-42d4-ac9b-6804c46381b1\") " Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.679834 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cf1e1153-7406-40e6-bbf3-4d48e1f5055c" (UID: "cf1e1153-7406-40e6-bbf3-4d48e1f5055c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.685265 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cf1e1153-7406-40e6-bbf3-4d48e1f5055c" (UID: "cf1e1153-7406-40e6-bbf3-4d48e1f5055c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.685571 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-logs" (OuterVolumeSpecName: "logs") pod "3d6b6787-6f7e-42d4-ac9b-6804c46381b1" (UID: "3d6b6787-6f7e-42d4-ac9b-6804c46381b1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.690745 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-kube-api-access-sbsln" (OuterVolumeSpecName: "kube-api-access-sbsln") pod "3d6b6787-6f7e-42d4-ac9b-6804c46381b1" (UID: "3d6b6787-6f7e-42d4-ac9b-6804c46381b1"). InnerVolumeSpecName "kube-api-access-sbsln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.694806 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-scripts" (OuterVolumeSpecName: "scripts") pod "cf1e1153-7406-40e6-bbf3-4d48e1f5055c" (UID: "cf1e1153-7406-40e6-bbf3-4d48e1f5055c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.707993 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-kube-api-access-hhrd9" (OuterVolumeSpecName: "kube-api-access-hhrd9") pod "cf1e1153-7406-40e6-bbf3-4d48e1f5055c" (UID: "cf1e1153-7406-40e6-bbf3-4d48e1f5055c"). InnerVolumeSpecName "kube-api-access-hhrd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.747967 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cf1e1153-7406-40e6-bbf3-4d48e1f5055c" (UID: "cf1e1153-7406-40e6-bbf3-4d48e1f5055c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.753113 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3d6b6787-6f7e-42d4-ac9b-6804c46381b1" (UID: "3d6b6787-6f7e-42d4-ac9b-6804c46381b1"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.759162 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d6b6787-6f7e-42d4-ac9b-6804c46381b1" (UID: "3d6b6787-6f7e-42d4-ac9b-6804c46381b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.782391 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.782416 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.782425 4717 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.782435 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.782442 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.782451 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.782458 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.782466 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhrd9\" (UniqueName: \"kubernetes.io/projected/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-kube-api-access-hhrd9\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.782476 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbsln\" (UniqueName: \"kubernetes.io/projected/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-kube-api-access-sbsln\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.783848 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-config-data" (OuterVolumeSpecName: "config-data") pod "3d6b6787-6f7e-42d4-ac9b-6804c46381b1" (UID: "3d6b6787-6f7e-42d4-ac9b-6804c46381b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.818806 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf1e1153-7406-40e6-bbf3-4d48e1f5055c" (UID: "cf1e1153-7406-40e6-bbf3-4d48e1f5055c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.833852 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-config-data" (OuterVolumeSpecName: "config-data") pod "cf1e1153-7406-40e6-bbf3-4d48e1f5055c" (UID: "cf1e1153-7406-40e6-bbf3-4d48e1f5055c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.884159 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6b6787-6f7e-42d4-ac9b-6804c46381b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.884186 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.884196 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1e1153-7406-40e6-bbf3-4d48e1f5055c-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.969840 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b"} Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.977820 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x6mmq" event={"ID":"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4","Type":"ContainerStarted","Data":"97368cb953e6d8fae4e88a61a4a2fd4420986ca65ff922b3047eab800868691f"} Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.996728 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3d6b6787-6f7e-42d4-ac9b-6804c46381b1","Type":"ContainerDied","Data":"0b2a7c718070addb5baa3d4a50f0619d7078c0ecebb4354af3a072918179472f"} Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.996861 4717 scope.go:117] "RemoveContainer" containerID="8a68d3cf3d2c08d1afd55e3dddbc8f3a914b835c9c186478556ad8d7bc0ab84c" Mar 08 05:49:38 crc kubenswrapper[4717]: I0308 05:49:38.996797 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.011872 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.012392 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf1e1153-7406-40e6-bbf3-4d48e1f5055c","Type":"ContainerDied","Data":"2afb48053ee31e2b4e21c994aca5d347ca949d7d1181f76b719d5931d8d872d7"} Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.020951 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-x6mmq" podStartSLOduration=2.067213756 podStartE2EDuration="15.020935857s" podCreationTimestamp="2026-03-08 05:49:24 +0000 UTC" firstStartedPulling="2026-03-08 05:49:25.255277158 +0000 UTC m=+1392.172926002" lastFinishedPulling="2026-03-08 05:49:38.208999259 +0000 UTC m=+1405.126648103" observedRunningTime="2026-03-08 05:49:39.00447192 +0000 UTC m=+1405.922120764" watchObservedRunningTime="2026-03-08 05:49:39.020935857 +0000 UTC m=+1405.938584701" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.081928 4717 scope.go:117] "RemoveContainer" containerID="728cf41f673161df30febf652dec117ca99cc0cdec84230f3bee2563104cdd7f" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.113771 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.141185 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.153731 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.166061 4717 scope.go:117] "RemoveContainer" containerID="ecaf4e2cf35a31084cc0f8600592fe6177c01a5d87d179c291045f3515b56c21" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.169747 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:39 crc kubenswrapper[4717]: 
I0308 05:49:39.191733 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 08 05:49:39 crc kubenswrapper[4717]: E0308 05:49:39.192103 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="ceilometer-central-agent" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.192114 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="ceilometer-central-agent" Mar 08 05:49:39 crc kubenswrapper[4717]: E0308 05:49:39.192136 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="proxy-httpd" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.192142 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="proxy-httpd" Mar 08 05:49:39 crc kubenswrapper[4717]: E0308 05:49:39.192156 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="ceilometer-notification-agent" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.192163 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="ceilometer-notification-agent" Mar 08 05:49:39 crc kubenswrapper[4717]: E0308 05:49:39.192169 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6b6787-6f7e-42d4-ac9b-6804c46381b1" containerName="watcher-decision-engine" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.192176 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6b6787-6f7e-42d4-ac9b-6804c46381b1" containerName="watcher-decision-engine" Mar 08 05:49:39 crc kubenswrapper[4717]: E0308 05:49:39.192183 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="sg-core" Mar 08 05:49:39 crc kubenswrapper[4717]: 
I0308 05:49:39.192190 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="sg-core" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.192349 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="proxy-httpd" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.192360 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="ceilometer-central-agent" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.192371 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6b6787-6f7e-42d4-ac9b-6804c46381b1" containerName="watcher-decision-engine" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.192381 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="sg-core" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.192394 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" containerName="ceilometer-notification-agent" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.193032 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.196033 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.211815 4717 scope.go:117] "RemoveContainer" containerID="27e4dc0f1597cc6f120537d22a1b9a6525bebb4e9da3de169766946a735211ed" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.218838 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.221912 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.246216 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.246459 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.252570 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.267162 4717 scope.go:117] "RemoveContainer" containerID="0add4af343d137d26183d105d00e179f0852cbbbac3c3042dda2cc8e80de3572" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.276745 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294414 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-scripts\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294459 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-config-data\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294498 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47tnn\" (UniqueName: \"kubernetes.io/projected/21222ccb-0409-43eb-8833-3820b0f56e7b-kube-api-access-47tnn\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 
05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294519 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294535 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294558 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-logs\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294587 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-log-httpd\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294621 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkvqz\" (UniqueName: \"kubernetes.io/projected/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-kube-api-access-kkvqz\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294640 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-run-httpd\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294664 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294723 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.294757 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.395727 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-scripts\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396081 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-config-data\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396124 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47tnn\" (UniqueName: \"kubernetes.io/projected/21222ccb-0409-43eb-8833-3820b0f56e7b-kube-api-access-47tnn\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396150 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396166 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396190 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-logs\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396214 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-log-httpd\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " 
pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396238 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkvqz\" (UniqueName: \"kubernetes.io/projected/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-kube-api-access-kkvqz\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396258 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-run-httpd\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396286 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396325 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.396360 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.397109 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-logs\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.397267 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-run-httpd\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.397513 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-log-httpd\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.401874 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.402600 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.404257 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-scripts\") pod \"ceilometer-0\" (UID: 
\"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.407291 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.407441 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.410746 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.411481 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-config-data\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.414769 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47tnn\" (UniqueName: \"kubernetes.io/projected/21222ccb-0409-43eb-8833-3820b0f56e7b-kube-api-access-47tnn\") pod \"ceilometer-0\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.415186 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-kkvqz\" (UniqueName: \"kubernetes.io/projected/e1b3240f-d4a8-409e-a8bf-a2f2d03ac126-kube-api-access-kkvqz\") pod \"watcher-decision-engine-0\" (UID: \"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126\") " pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.518204 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.566274 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.792695 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6b6787-6f7e-42d4-ac9b-6804c46381b1" path="/var/lib/kubelet/pods/3d6b6787-6f7e-42d4-ac9b-6804c46381b1/volumes" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.793658 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf1e1153-7406-40e6-bbf3-4d48e1f5055c" path="/var/lib/kubelet/pods/cf1e1153-7406-40e6-bbf3-4d48e1f5055c/volumes" Mar 08 05:49:39 crc kubenswrapper[4717]: I0308 05:49:39.937940 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:40 crc kubenswrapper[4717]: I0308 05:49:40.040642 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 08 05:49:40 crc kubenswrapper[4717]: I0308 05:49:40.116950 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:40 crc kubenswrapper[4717]: W0308 05:49:40.134091 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21222ccb_0409_43eb_8833_3820b0f56e7b.slice/crio-a6254469d45ce7df3d8558c80642d8cbd661ee823ed883545b3fe88a0795dd54 WatchSource:0}: Error finding container a6254469d45ce7df3d8558c80642d8cbd661ee823ed883545b3fe88a0795dd54: Status 404 
returned error can't find the container with id a6254469d45ce7df3d8558c80642d8cbd661ee823ed883545b3fe88a0795dd54 Mar 08 05:49:41 crc kubenswrapper[4717]: I0308 05:49:41.031803 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126","Type":"ContainerStarted","Data":"5d960a8100523b791ab966e412ba703083fd665b0b0e146bc90e326de42aed2d"} Mar 08 05:49:41 crc kubenswrapper[4717]: I0308 05:49:41.032257 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e1b3240f-d4a8-409e-a8bf-a2f2d03ac126","Type":"ContainerStarted","Data":"e91d1fb545ecfd5ee2c291dd479511c48af381c2ecf35ee2a89f4d23882d6462"} Mar 08 05:49:41 crc kubenswrapper[4717]: I0308 05:49:41.035613 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21222ccb-0409-43eb-8833-3820b0f56e7b","Type":"ContainerStarted","Data":"f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5"} Mar 08 05:49:41 crc kubenswrapper[4717]: I0308 05:49:41.035643 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21222ccb-0409-43eb-8833-3820b0f56e7b","Type":"ContainerStarted","Data":"48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341"} Mar 08 05:49:41 crc kubenswrapper[4717]: I0308 05:49:41.035654 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21222ccb-0409-43eb-8833-3820b0f56e7b","Type":"ContainerStarted","Data":"a6254469d45ce7df3d8558c80642d8cbd661ee823ed883545b3fe88a0795dd54"} Mar 08 05:49:41 crc kubenswrapper[4717]: I0308 05:49:41.057118 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.057099842 podStartE2EDuration="2.057099842s" podCreationTimestamp="2026-03-08 05:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:41.048549371 +0000 UTC m=+1407.966198215" watchObservedRunningTime="2026-03-08 05:49:41.057099842 +0000 UTC m=+1407.974748686" Mar 08 05:49:44 crc kubenswrapper[4717]: I0308 05:49:44.077495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21222ccb-0409-43eb-8833-3820b0f56e7b","Type":"ContainerStarted","Data":"e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3"} Mar 08 05:49:46 crc kubenswrapper[4717]: I0308 05:49:46.098318 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21222ccb-0409-43eb-8833-3820b0f56e7b","Type":"ContainerStarted","Data":"92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08"} Mar 08 05:49:46 crc kubenswrapper[4717]: I0308 05:49:46.098804 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 05:49:46 crc kubenswrapper[4717]: I0308 05:49:46.098497 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="ceilometer-central-agent" containerID="cri-o://48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341" gracePeriod=30 Mar 08 05:49:46 crc kubenswrapper[4717]: I0308 05:49:46.098827 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="sg-core" containerID="cri-o://e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3" gracePeriod=30 Mar 08 05:49:46 crc kubenswrapper[4717]: I0308 05:49:46.098808 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="proxy-httpd" containerID="cri-o://92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08" 
gracePeriod=30 Mar 08 05:49:46 crc kubenswrapper[4717]: I0308 05:49:46.098901 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="ceilometer-notification-agent" containerID="cri-o://f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5" gracePeriod=30 Mar 08 05:49:46 crc kubenswrapper[4717]: I0308 05:49:46.123309 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.202889111 podStartE2EDuration="7.123295293s" podCreationTimestamp="2026-03-08 05:49:39 +0000 UTC" firstStartedPulling="2026-03-08 05:49:40.13661443 +0000 UTC m=+1407.054263274" lastFinishedPulling="2026-03-08 05:49:45.057020612 +0000 UTC m=+1411.974669456" observedRunningTime="2026-03-08 05:49:46.120283908 +0000 UTC m=+1413.037932752" watchObservedRunningTime="2026-03-08 05:49:46.123295293 +0000 UTC m=+1413.040944127" Mar 08 05:49:47 crc kubenswrapper[4717]: I0308 05:49:47.111945 4717 generic.go:334] "Generic (PLEG): container finished" podID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerID="92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08" exitCode=0 Mar 08 05:49:47 crc kubenswrapper[4717]: I0308 05:49:47.112248 4717 generic.go:334] "Generic (PLEG): container finished" podID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerID="e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3" exitCode=2 Mar 08 05:49:47 crc kubenswrapper[4717]: I0308 05:49:47.112259 4717 generic.go:334] "Generic (PLEG): container finished" podID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerID="f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5" exitCode=0 Mar 08 05:49:47 crc kubenswrapper[4717]: I0308 05:49:47.112126 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"21222ccb-0409-43eb-8833-3820b0f56e7b","Type":"ContainerDied","Data":"92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08"} Mar 08 05:49:47 crc kubenswrapper[4717]: I0308 05:49:47.112295 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21222ccb-0409-43eb-8833-3820b0f56e7b","Type":"ContainerDied","Data":"e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3"} Mar 08 05:49:47 crc kubenswrapper[4717]: I0308 05:49:47.112315 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21222ccb-0409-43eb-8833-3820b0f56e7b","Type":"ContainerDied","Data":"f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5"} Mar 08 05:49:48 crc kubenswrapper[4717]: E0308 05:49:48.302330 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6345fc76_e42d_4a13_90d2_c2bd5135f073.slice/crio-conmon-75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319.scope\": RecentStats: unable to find data in memory cache]" Mar 08 05:49:49 crc kubenswrapper[4717]: I0308 05:49:49.518615 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:49 crc kubenswrapper[4717]: I0308 05:49:49.554252 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:50 crc kubenswrapper[4717]: I0308 05:49:50.179203 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:50 crc kubenswrapper[4717]: I0308 05:49:50.229116 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.217854 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4" containerID="97368cb953e6d8fae4e88a61a4a2fd4420986ca65ff922b3047eab800868691f" exitCode=0 Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.218039 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x6mmq" event={"ID":"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4","Type":"ContainerDied","Data":"97368cb953e6d8fae4e88a61a4a2fd4420986ca65ff922b3047eab800868691f"} Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.797835 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.976135 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-log-httpd\") pod \"21222ccb-0409-43eb-8833-3820b0f56e7b\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.976241 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-scripts\") pod \"21222ccb-0409-43eb-8833-3820b0f56e7b\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.976425 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-config-data\") pod \"21222ccb-0409-43eb-8833-3820b0f56e7b\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.976553 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-combined-ca-bundle\") pod \"21222ccb-0409-43eb-8833-3820b0f56e7b\" (UID: 
\"21222ccb-0409-43eb-8833-3820b0f56e7b\") " Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.976612 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47tnn\" (UniqueName: \"kubernetes.io/projected/21222ccb-0409-43eb-8833-3820b0f56e7b-kube-api-access-47tnn\") pod \"21222ccb-0409-43eb-8833-3820b0f56e7b\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.976723 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-sg-core-conf-yaml\") pod \"21222ccb-0409-43eb-8833-3820b0f56e7b\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.976779 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-run-httpd\") pod \"21222ccb-0409-43eb-8833-3820b0f56e7b\" (UID: \"21222ccb-0409-43eb-8833-3820b0f56e7b\") " Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.977156 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "21222ccb-0409-43eb-8833-3820b0f56e7b" (UID: "21222ccb-0409-43eb-8833-3820b0f56e7b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.977479 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.977715 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "21222ccb-0409-43eb-8833-3820b0f56e7b" (UID: "21222ccb-0409-43eb-8833-3820b0f56e7b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.982275 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21222ccb-0409-43eb-8833-3820b0f56e7b-kube-api-access-47tnn" (OuterVolumeSpecName: "kube-api-access-47tnn") pod "21222ccb-0409-43eb-8833-3820b0f56e7b" (UID: "21222ccb-0409-43eb-8833-3820b0f56e7b"). InnerVolumeSpecName "kube-api-access-47tnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:54 crc kubenswrapper[4717]: I0308 05:49:54.985822 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-scripts" (OuterVolumeSpecName: "scripts") pod "21222ccb-0409-43eb-8833-3820b0f56e7b" (UID: "21222ccb-0409-43eb-8833-3820b0f56e7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.007792 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "21222ccb-0409-43eb-8833-3820b0f56e7b" (UID: "21222ccb-0409-43eb-8833-3820b0f56e7b"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.074012 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21222ccb-0409-43eb-8833-3820b0f56e7b" (UID: "21222ccb-0409-43eb-8833-3820b0f56e7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.078614 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.078815 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.078891 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47tnn\" (UniqueName: \"kubernetes.io/projected/21222ccb-0409-43eb-8833-3820b0f56e7b-kube-api-access-47tnn\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.078951 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.079000 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21222ccb-0409-43eb-8833-3820b0f56e7b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.116963 4717 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-config-data" (OuterVolumeSpecName: "config-data") pod "21222ccb-0409-43eb-8833-3820b0f56e7b" (UID: "21222ccb-0409-43eb-8833-3820b0f56e7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.180265 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21222ccb-0409-43eb-8833-3820b0f56e7b-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.229173 4717 generic.go:334] "Generic (PLEG): container finished" podID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerID="48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341" exitCode=0 Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.229257 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21222ccb-0409-43eb-8833-3820b0f56e7b","Type":"ContainerDied","Data":"48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341"} Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.229303 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21222ccb-0409-43eb-8833-3820b0f56e7b","Type":"ContainerDied","Data":"a6254469d45ce7df3d8558c80642d8cbd661ee823ed883545b3fe88a0795dd54"} Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.229320 4717 scope.go:117] "RemoveContainer" containerID="92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.230456 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.255434 4717 scope.go:117] "RemoveContainer" containerID="e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.281478 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.301070 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.302463 4717 scope.go:117] "RemoveContainer" containerID="f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.309293 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:55 crc kubenswrapper[4717]: E0308 05:49:55.309633 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="proxy-httpd" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.309647 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="proxy-httpd" Mar 08 05:49:55 crc kubenswrapper[4717]: E0308 05:49:55.309658 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="ceilometer-notification-agent" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.309664 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="ceilometer-notification-agent" Mar 08 05:49:55 crc kubenswrapper[4717]: E0308 05:49:55.309708 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="sg-core" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.309714 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="sg-core" Mar 08 05:49:55 crc kubenswrapper[4717]: E0308 05:49:55.309735 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="ceilometer-central-agent" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.309741 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="ceilometer-central-agent" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.309910 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="ceilometer-central-agent" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.309929 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="ceilometer-notification-agent" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.309939 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="proxy-httpd" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.309946 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" containerName="sg-core" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.312738 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.316835 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.317275 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.335337 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.370580 4717 scope.go:117] "RemoveContainer" containerID="48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.413023 4717 scope.go:117] "RemoveContainer" containerID="92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08" Mar 08 05:49:55 crc kubenswrapper[4717]: E0308 05:49:55.413523 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08\": container with ID starting with 92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08 not found: ID does not exist" containerID="92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.413569 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08"} err="failed to get container status \"92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08\": rpc error: code = NotFound desc = could not find container \"92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08\": container with ID starting with 92404015c64f23d40c5058cf53ec8ede022c2e8ccb46914af6ffa3604cca4b08 not found: ID does not exist" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 
05:49:55.413601 4717 scope.go:117] "RemoveContainer" containerID="e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3" Mar 08 05:49:55 crc kubenswrapper[4717]: E0308 05:49:55.413948 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3\": container with ID starting with e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3 not found: ID does not exist" containerID="e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.413982 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3"} err="failed to get container status \"e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3\": rpc error: code = NotFound desc = could not find container \"e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3\": container with ID starting with e6325c29c242c3728a5a16774d5e877c1a21a1bd421fd8d450bc220c5246bbd3 not found: ID does not exist" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.414018 4717 scope.go:117] "RemoveContainer" containerID="f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5" Mar 08 05:49:55 crc kubenswrapper[4717]: E0308 05:49:55.414248 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5\": container with ID starting with f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5 not found: ID does not exist" containerID="f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.414279 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5"} err="failed to get container status \"f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5\": rpc error: code = NotFound desc = could not find container \"f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5\": container with ID starting with f8ddf9e2b93622d4a869504be3b40e33d0c9f4bbb60ad0d970f38cad8c8c73f5 not found: ID does not exist" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.414296 4717 scope.go:117] "RemoveContainer" containerID="48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341" Mar 08 05:49:55 crc kubenswrapper[4717]: E0308 05:49:55.414586 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341\": container with ID starting with 48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341 not found: ID does not exist" containerID="48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.414614 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341"} err="failed to get container status \"48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341\": rpc error: code = NotFound desc = could not find container \"48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341\": container with ID starting with 48c9809a8ce331ff64ae207836915067707605d4d7a2a4f9e506439a40e8b341 not found: ID does not exist" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.485191 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-run-httpd\") pod \"ceilometer-0\" (UID: 
\"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.485246 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.485271 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtnrb\" (UniqueName: \"kubernetes.io/projected/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-kube-api-access-qtnrb\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.485291 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-log-httpd\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.485562 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-config-data\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.485717 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-scripts\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.485869 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.587410 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-log-httpd\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.587521 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-config-data\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.587569 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-scripts\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.587615 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.587648 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-run-httpd\") pod \"ceilometer-0\" (UID: 
\"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.587666 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.587681 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtnrb\" (UniqueName: \"kubernetes.io/projected/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-kube-api-access-qtnrb\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.588566 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-log-httpd\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.588826 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-run-httpd\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.594273 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-config-data\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.594838 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.603094 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.603444 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-scripts\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.605405 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtnrb\" (UniqueName: \"kubernetes.io/projected/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-kube-api-access-qtnrb\") pod \"ceilometer-0\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.630565 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.730741 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.793367 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21222ccb-0409-43eb-8833-3820b0f56e7b" path="/var/lib/kubelet/pods/21222ccb-0409-43eb-8833-3820b0f56e7b/volumes" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.892174 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-combined-ca-bundle\") pod \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.892451 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z5sg\" (UniqueName: \"kubernetes.io/projected/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-kube-api-access-4z5sg\") pod \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.892482 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-scripts\") pod \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.892556 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-config-data\") pod \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\" (UID: \"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4\") " Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.896116 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-scripts" (OuterVolumeSpecName: "scripts") pod 
"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4" (UID: "1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.896824 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-kube-api-access-4z5sg" (OuterVolumeSpecName: "kube-api-access-4z5sg") pod "1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4" (UID: "1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4"). InnerVolumeSpecName "kube-api-access-4z5sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.928502 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-config-data" (OuterVolumeSpecName: "config-data") pod "1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4" (UID: "1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.955257 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4" (UID: "1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.995018 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.995054 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z5sg\" (UniqueName: \"kubernetes.io/projected/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-kube-api-access-4z5sg\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.995067 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:55 crc kubenswrapper[4717]: I0308 05:49:55.995076 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.055977 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:49:56 crc kubenswrapper[4717]: W0308 05:49:56.060121 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice/crio-fb6f55e48ccb911ed71bbc3714fd88f1486751db07c507332eb7c5c244d307bc WatchSource:0}: Error finding container fb6f55e48ccb911ed71bbc3714fd88f1486751db07c507332eb7c5c244d307bc: Status 404 returned error can't find the container with id fb6f55e48ccb911ed71bbc3714fd88f1486751db07c507332eb7c5c244d307bc Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.247975 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x6mmq" 
event={"ID":"1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4","Type":"ContainerDied","Data":"631d46ece090d6c055b3b03af82712987fe4d442315659db81adfb5562d8a097"} Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.248035 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="631d46ece090d6c055b3b03af82712987fe4d442315659db81adfb5562d8a097" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.248168 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x6mmq" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.250214 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d377841-4cb0-42fb-b8e6-5fca1f0263a4","Type":"ContainerStarted","Data":"fb6f55e48ccb911ed71bbc3714fd88f1486751db07c507332eb7c5c244d307bc"} Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.377766 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 05:49:56 crc kubenswrapper[4717]: E0308 05:49:56.381879 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4" containerName="nova-cell0-conductor-db-sync" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.382089 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4" containerName="nova-cell0-conductor-db-sync" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.383567 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4" containerName="nova-cell0-conductor-db-sync" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.385830 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.386315 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.396452 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2r54d" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.396649 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.416744 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x22h6\" (UniqueName: \"kubernetes.io/projected/398b8cef-0b8b-4e8f-80c2-2afa74fa75be-kube-api-access-x22h6\") pod \"nova-cell0-conductor-0\" (UID: \"398b8cef-0b8b-4e8f-80c2-2afa74fa75be\") " pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.416814 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398b8cef-0b8b-4e8f-80c2-2afa74fa75be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"398b8cef-0b8b-4e8f-80c2-2afa74fa75be\") " pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.416832 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398b8cef-0b8b-4e8f-80c2-2afa74fa75be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"398b8cef-0b8b-4e8f-80c2-2afa74fa75be\") " pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.518298 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x22h6\" (UniqueName: 
\"kubernetes.io/projected/398b8cef-0b8b-4e8f-80c2-2afa74fa75be-kube-api-access-x22h6\") pod \"nova-cell0-conductor-0\" (UID: \"398b8cef-0b8b-4e8f-80c2-2afa74fa75be\") " pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.518732 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398b8cef-0b8b-4e8f-80c2-2afa74fa75be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"398b8cef-0b8b-4e8f-80c2-2afa74fa75be\") " pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.518761 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398b8cef-0b8b-4e8f-80c2-2afa74fa75be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"398b8cef-0b8b-4e8f-80c2-2afa74fa75be\") " pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.525571 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398b8cef-0b8b-4e8f-80c2-2afa74fa75be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"398b8cef-0b8b-4e8f-80c2-2afa74fa75be\") " pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.533553 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398b8cef-0b8b-4e8f-80c2-2afa74fa75be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"398b8cef-0b8b-4e8f-80c2-2afa74fa75be\") " pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.537695 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x22h6\" (UniqueName: \"kubernetes.io/projected/398b8cef-0b8b-4e8f-80c2-2afa74fa75be-kube-api-access-x22h6\") pod \"nova-cell0-conductor-0\" (UID: 
\"398b8cef-0b8b-4e8f-80c2-2afa74fa75be\") " pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:56 crc kubenswrapper[4717]: I0308 05:49:56.783085 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:57 crc kubenswrapper[4717]: I0308 05:49:57.266217 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d377841-4cb0-42fb-b8e6-5fca1f0263a4","Type":"ContainerStarted","Data":"eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3"} Mar 08 05:49:57 crc kubenswrapper[4717]: I0308 05:49:57.266676 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d377841-4cb0-42fb-b8e6-5fca1f0263a4","Type":"ContainerStarted","Data":"d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438"} Mar 08 05:49:57 crc kubenswrapper[4717]: I0308 05:49:57.268267 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 05:49:57 crc kubenswrapper[4717]: W0308 05:49:57.271938 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod398b8cef_0b8b_4e8f_80c2_2afa74fa75be.slice/crio-9034aabbe4a9274a8c401bc53d609c11325357606f28338c0d049fb999f201a7 WatchSource:0}: Error finding container 9034aabbe4a9274a8c401bc53d609c11325357606f28338c0d049fb999f201a7: Status 404 returned error can't find the container with id 9034aabbe4a9274a8c401bc53d609c11325357606f28338c0d049fb999f201a7 Mar 08 05:49:58 crc kubenswrapper[4717]: I0308 05:49:58.276442 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"398b8cef-0b8b-4e8f-80c2-2afa74fa75be","Type":"ContainerStarted","Data":"91e3d6c69cc153376408e925b714d9546d99c8ed4396464ef67e331d93f8028a"} Mar 08 05:49:58 crc kubenswrapper[4717]: I0308 05:49:58.276676 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-0" event={"ID":"398b8cef-0b8b-4e8f-80c2-2afa74fa75be","Type":"ContainerStarted","Data":"9034aabbe4a9274a8c401bc53d609c11325357606f28338c0d049fb999f201a7"} Mar 08 05:49:58 crc kubenswrapper[4717]: I0308 05:49:58.277698 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 05:49:58 crc kubenswrapper[4717]: I0308 05:49:58.280875 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d377841-4cb0-42fb-b8e6-5fca1f0263a4","Type":"ContainerStarted","Data":"bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac"} Mar 08 05:49:58 crc kubenswrapper[4717]: E0308 05:49:58.628545 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6345fc76_e42d_4a13_90d2_c2bd5135f073.slice/crio-conmon-75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319.scope\": RecentStats: unable to find data in memory cache]" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.139139 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=4.13911729 podStartE2EDuration="4.13911729s" podCreationTimestamp="2026-03-08 05:49:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:49:58.292919345 +0000 UTC m=+1425.210568189" watchObservedRunningTime="2026-03-08 05:50:00.13911729 +0000 UTC m=+1427.056766134" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.143532 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549150-rddpb"] Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.145894 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549150-rddpb" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.149269 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.149530 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.151251 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549150-rddpb"] Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.166583 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.307019 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d377841-4cb0-42fb-b8e6-5fca1f0263a4","Type":"ContainerStarted","Data":"5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835"} Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.307966 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.310856 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwhmw\" (UniqueName: \"kubernetes.io/projected/cf7cd813-172e-475f-aef7-609e9932b290-kube-api-access-lwhmw\") pod \"auto-csr-approver-29549150-rddpb\" (UID: \"cf7cd813-172e-475f-aef7-609e9932b290\") " pod="openshift-infra/auto-csr-approver-29549150-rddpb" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.365283 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.356107853 podStartE2EDuration="5.365261332s" podCreationTimestamp="2026-03-08 05:49:55 +0000 UTC" firstStartedPulling="2026-03-08 
05:49:56.063411995 +0000 UTC m=+1422.981060839" lastFinishedPulling="2026-03-08 05:49:59.072565444 +0000 UTC m=+1425.990214318" observedRunningTime="2026-03-08 05:50:00.339060574 +0000 UTC m=+1427.256709448" watchObservedRunningTime="2026-03-08 05:50:00.365261332 +0000 UTC m=+1427.282910176" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.412197 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwhmw\" (UniqueName: \"kubernetes.io/projected/cf7cd813-172e-475f-aef7-609e9932b290-kube-api-access-lwhmw\") pod \"auto-csr-approver-29549150-rddpb\" (UID: \"cf7cd813-172e-475f-aef7-609e9932b290\") " pod="openshift-infra/auto-csr-approver-29549150-rddpb" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.432734 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwhmw\" (UniqueName: \"kubernetes.io/projected/cf7cd813-172e-475f-aef7-609e9932b290-kube-api-access-lwhmw\") pod \"auto-csr-approver-29549150-rddpb\" (UID: \"cf7cd813-172e-475f-aef7-609e9932b290\") " pod="openshift-infra/auto-csr-approver-29549150-rddpb" Mar 08 05:50:00 crc kubenswrapper[4717]: I0308 05:50:00.466572 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549150-rddpb" Mar 08 05:50:01 crc kubenswrapper[4717]: I0308 05:50:01.012715 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549150-rddpb"] Mar 08 05:50:01 crc kubenswrapper[4717]: W0308 05:50:01.057738 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf7cd813_172e_475f_aef7_609e9932b290.slice/crio-ddb45ea408c52c49c754e6aa5e7649a03d339d67aee1ab45921307692a7e4832 WatchSource:0}: Error finding container ddb45ea408c52c49c754e6aa5e7649a03d339d67aee1ab45921307692a7e4832: Status 404 returned error can't find the container with id ddb45ea408c52c49c754e6aa5e7649a03d339d67aee1ab45921307692a7e4832 Mar 08 05:50:01 crc kubenswrapper[4717]: I0308 05:50:01.317139 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549150-rddpb" event={"ID":"cf7cd813-172e-475f-aef7-609e9932b290","Type":"ContainerStarted","Data":"ddb45ea408c52c49c754e6aa5e7649a03d339d67aee1ab45921307692a7e4832"} Mar 08 05:50:03 crc kubenswrapper[4717]: I0308 05:50:03.337116 4717 generic.go:334] "Generic (PLEG): container finished" podID="cf7cd813-172e-475f-aef7-609e9932b290" containerID="64e508a9372bb09e04233f401018389b5d43f61eda7d881c7626436d11a2d294" exitCode=0 Mar 08 05:50:03 crc kubenswrapper[4717]: I0308 05:50:03.337204 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549150-rddpb" event={"ID":"cf7cd813-172e-475f-aef7-609e9932b290","Type":"ContainerDied","Data":"64e508a9372bb09e04233f401018389b5d43f61eda7d881c7626436d11a2d294"} Mar 08 05:50:04 crc kubenswrapper[4717]: I0308 05:50:04.812539 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549150-rddpb" Mar 08 05:50:04 crc kubenswrapper[4717]: I0308 05:50:04.904450 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwhmw\" (UniqueName: \"kubernetes.io/projected/cf7cd813-172e-475f-aef7-609e9932b290-kube-api-access-lwhmw\") pod \"cf7cd813-172e-475f-aef7-609e9932b290\" (UID: \"cf7cd813-172e-475f-aef7-609e9932b290\") " Mar 08 05:50:04 crc kubenswrapper[4717]: I0308 05:50:04.911932 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7cd813-172e-475f-aef7-609e9932b290-kube-api-access-lwhmw" (OuterVolumeSpecName: "kube-api-access-lwhmw") pod "cf7cd813-172e-475f-aef7-609e9932b290" (UID: "cf7cd813-172e-475f-aef7-609e9932b290"). InnerVolumeSpecName "kube-api-access-lwhmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:05 crc kubenswrapper[4717]: I0308 05:50:05.007103 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwhmw\" (UniqueName: \"kubernetes.io/projected/cf7cd813-172e-475f-aef7-609e9932b290-kube-api-access-lwhmw\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:05 crc kubenswrapper[4717]: I0308 05:50:05.375003 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549150-rddpb" event={"ID":"cf7cd813-172e-475f-aef7-609e9932b290","Type":"ContainerDied","Data":"ddb45ea408c52c49c754e6aa5e7649a03d339d67aee1ab45921307692a7e4832"} Mar 08 05:50:05 crc kubenswrapper[4717]: I0308 05:50:05.375066 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddb45ea408c52c49c754e6aa5e7649a03d339d67aee1ab45921307692a7e4832" Mar 08 05:50:05 crc kubenswrapper[4717]: I0308 05:50:05.375112 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549150-rddpb" Mar 08 05:50:05 crc kubenswrapper[4717]: I0308 05:50:05.921262 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549144-cfx7q"] Mar 08 05:50:05 crc kubenswrapper[4717]: I0308 05:50:05.932706 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549144-cfx7q"] Mar 08 05:50:06 crc kubenswrapper[4717]: I0308 05:50:06.828342 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.433293 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-dgm67"] Mar 08 05:50:07 crc kubenswrapper[4717]: E0308 05:50:07.434216 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7cd813-172e-475f-aef7-609e9932b290" containerName="oc" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.434238 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7cd813-172e-475f-aef7-609e9932b290" containerName="oc" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.434713 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7cd813-172e-475f-aef7-609e9932b290" containerName="oc" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.436064 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.440543 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.440842 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.458722 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dgm67"] Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.563935 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-config-data\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.564004 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vts2g\" (UniqueName: \"kubernetes.io/projected/29ac0da5-7bbd-420a-b56d-60c621244d30-kube-api-access-vts2g\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.564029 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.564059 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-scripts\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.580938 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.582738 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.584129 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.596611 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.643440 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.645797 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.649760 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.659147 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.660774 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.662569 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.669782 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-scripts\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.669838 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5r7t\" (UniqueName: \"kubernetes.io/projected/a867b873-1e1b-4104-ae6d-34f2d216c3ca-kube-api-access-q5r7t\") pod \"nova-cell1-novncproxy-0\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.669881 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.669916 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.669981 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-config-data\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.670021 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vts2g\" (UniqueName: \"kubernetes.io/projected/29ac0da5-7bbd-420a-b56d-60c621244d30-kube-api-access-vts2g\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.670041 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.670796 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.675626 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-scripts\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.682905 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.686766 4717 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.687306 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-config-data\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.708388 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vts2g\" (UniqueName: \"kubernetes.io/projected/29ac0da5-7bbd-420a-b56d-60c621244d30-kube-api-access-vts2g\") pod \"nova-cell0-cell-mapping-dgm67\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.777879 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.779080 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5r7t\" (UniqueName: \"kubernetes.io/projected/a867b873-1e1b-4104-ae6d-34f2d216c3ca-kube-api-access-q5r7t\") pod \"nova-cell1-novncproxy-0\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.779131 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vdt\" (UniqueName: \"kubernetes.io/projected/d96238f0-f729-41c0-8505-952b39cc7ca9-kube-api-access-h2vdt\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.779158 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-config-data\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.779179 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.779196 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.779229 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.779259 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d96238f0-f729-41c0-8505-952b39cc7ca9-logs\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.779841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.779888 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cml7k\" (UniqueName: \"kubernetes.io/projected/4e1eab09-61ae-43df-9173-7267107e9f24-kube-api-access-cml7k\") pod \"nova-scheduler-0\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.779932 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-config-data\") pod \"nova-scheduler-0\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.782483 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.809158 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.814936 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="008148d1-0dc4-4b2d-a69c-be8ee1b204c0" path="/var/lib/kubelet/pods/008148d1-0dc4-4b2d-a69c-be8ee1b204c0/volumes" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.821062 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:07 crc 
kubenswrapper[4717]: I0308 05:50:07.825302 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.840782 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.856414 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.859605 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5r7t\" (UniqueName: \"kubernetes.io/projected/a867b873-1e1b-4104-ae6d-34f2d216c3ca-kube-api-access-q5r7t\") pod \"nova-cell1-novncproxy-0\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.890637 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.890898 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cml7k\" (UniqueName: \"kubernetes.io/projected/4e1eab09-61ae-43df-9173-7267107e9f24-kube-api-access-cml7k\") pod \"nova-scheduler-0\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.890994 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-config-data\") pod \"nova-scheduler-0\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 
05:50:07.891110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2vdt\" (UniqueName: \"kubernetes.io/projected/d96238f0-f729-41c0-8505-952b39cc7ca9-kube-api-access-h2vdt\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.891185 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-config-data\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.891245 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.891347 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d96238f0-f729-41c0-8505-952b39cc7ca9-logs\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.899298 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d96238f0-f729-41c0-8505-952b39cc7ca9-logs\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.900518 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.905945 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.906379 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-config-data\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.906387 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.913277 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-config-data\") pod \"nova-scheduler-0\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.976367 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vdt\" (UniqueName: \"kubernetes.io/projected/d96238f0-f729-41c0-8505-952b39cc7ca9-kube-api-access-h2vdt\") pod \"nova-api-0\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " pod="openstack/nova-api-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.980614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cml7k\" (UniqueName: 
\"kubernetes.io/projected/4e1eab09-61ae-43df-9173-7267107e9f24-kube-api-access-cml7k\") pod \"nova-scheduler-0\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.988968 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-575bd44df9-5d8wr"] Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.990540 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.992766 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.993777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-config-data\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.994057 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbv8z\" (UniqueName: \"kubernetes.io/projected/a8d0ca92-311c-46a4-b179-74a5091722ca-kube-api-access-lbv8z\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:07 crc kubenswrapper[4717]: I0308 05:50:07.994153 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8d0ca92-311c-46a4-b179-74a5091722ca-logs\") pod \"nova-metadata-0\" (UID: 
\"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.001613 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-575bd44df9-5d8wr"] Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.069190 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.098609 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-svc\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.098863 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8d0ca92-311c-46a4-b179-74a5091722ca-logs\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.099059 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qchj7\" (UniqueName: \"kubernetes.io/projected/0a2f4071-99c2-4755-af9b-ba683a154d22-kube-api-access-qchj7\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.099172 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-sb\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc 
kubenswrapper[4717]: I0308 05:50:08.099251 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.099341 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-config\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.099425 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-swift-storage-0\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.099524 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-nb\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.099620 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-config-data\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.099712 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbv8z\" (UniqueName: \"kubernetes.io/projected/a8d0ca92-311c-46a4-b179-74a5091722ca-kube-api-access-lbv8z\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.101144 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8d0ca92-311c-46a4-b179-74a5091722ca-logs\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.111706 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-config-data\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.119075 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.122586 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbv8z\" (UniqueName: \"kubernetes.io/projected/a8d0ca92-311c-46a4-b179-74a5091722ca-kube-api-access-lbv8z\") pod \"nova-metadata-0\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " pod="openstack/nova-metadata-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.201631 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qchj7\" (UniqueName: \"kubernetes.io/projected/0a2f4071-99c2-4755-af9b-ba683a154d22-kube-api-access-qchj7\") pod 
\"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.201957 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-sb\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.201987 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-config\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.202008 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-swift-storage-0\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.202044 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-nb\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.202079 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-svc\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: 
\"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.202749 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-sb\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.202912 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-svc\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.203293 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-swift-storage-0\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.203931 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-config\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.204277 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-nb\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc 
kubenswrapper[4717]: I0308 05:50:08.217521 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qchj7\" (UniqueName: \"kubernetes.io/projected/0a2f4071-99c2-4755-af9b-ba683a154d22-kube-api-access-qchj7\") pod \"dnsmasq-dns-575bd44df9-5d8wr\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.263014 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.349936 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.387082 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.445487 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dgm67"] Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.620273 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 05:50:08 crc kubenswrapper[4717]: W0308 05:50:08.628939 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda867b873_1e1b_4104_ae6d_34f2d216c3ca.slice/crio-c5a23b6ca3aad1d6d28665b167ba150ac248ed895d60add99e15fd1623c098f8 WatchSource:0}: Error finding container c5a23b6ca3aad1d6d28665b167ba150ac248ed895d60add99e15fd1623c098f8: Status 404 returned error can't find the container with id c5a23b6ca3aad1d6d28665b167ba150ac248ed895d60add99e15fd1623c098f8 Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.798583 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.830224 4717 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s2nrg"] Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.833227 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.838311 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.838585 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.900434 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s2nrg"] Mar 08 05:50:08 crc kubenswrapper[4717]: I0308 05:50:08.917892 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:50:08 crc kubenswrapper[4717]: E0308 05:50:08.954790 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6345fc76_e42d_4a13_90d2_c2bd5135f073.slice/crio-conmon-75fd489306e93d5c0ef9f8b9908abdce79e38e2694c82eb644a925e6d456d319.scope\": RecentStats: unable to find data in memory cache]" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.016735 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-575bd44df9-5d8wr"] Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.024292 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-config-data\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 
05:50:09.024440 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.024538 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-scripts\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.024766 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fshgw\" (UniqueName: \"kubernetes.io/projected/84a798f5-2296-45b1-ad1e-5d31f85c67d3-kube-api-access-fshgw\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: W0308 05:50:09.025880 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a2f4071_99c2_4755_af9b_ba683a154d22.slice/crio-1ce3fdfc41d8f0eaae914cc19d1eb012b46ab97fae33602352d9ff76026eca5b WatchSource:0}: Error finding container 1ce3fdfc41d8f0eaae914cc19d1eb012b46ab97fae33602352d9ff76026eca5b: Status 404 returned error can't find the container with id 1ce3fdfc41d8f0eaae914cc19d1eb012b46ab97fae33602352d9ff76026eca5b Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.126896 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.126949 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-scripts\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.127020 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fshgw\" (UniqueName: \"kubernetes.io/projected/84a798f5-2296-45b1-ad1e-5d31f85c67d3-kube-api-access-fshgw\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.127118 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-config-data\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.131503 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-scripts\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.131534 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-config-data\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.145567 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.153742 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fshgw\" (UniqueName: \"kubernetes.io/projected/84a798f5-2296-45b1-ad1e-5d31f85c67d3-kube-api-access-fshgw\") pod \"nova-cell1-conductor-db-sync-s2nrg\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.174355 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.228508 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.450505 4717 generic.go:334] "Generic (PLEG): container finished" podID="0a2f4071-99c2-4755-af9b-ba683a154d22" containerID="1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5" exitCode=0 Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.450834 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" event={"ID":"0a2f4071-99c2-4755-af9b-ba683a154d22","Type":"ContainerDied","Data":"1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5"} Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.450857 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" event={"ID":"0a2f4071-99c2-4755-af9b-ba683a154d22","Type":"ContainerStarted","Data":"1ce3fdfc41d8f0eaae914cc19d1eb012b46ab97fae33602352d9ff76026eca5b"} Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.452470 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a867b873-1e1b-4104-ae6d-34f2d216c3ca","Type":"ContainerStarted","Data":"c5a23b6ca3aad1d6d28665b167ba150ac248ed895d60add99e15fd1623c098f8"} Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.453636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e1eab09-61ae-43df-9173-7267107e9f24","Type":"ContainerStarted","Data":"9db180491c91035e48b830586a5b562c3f31e6a5b2768b448d6411543b3fdcf2"} Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.455044 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8d0ca92-311c-46a4-b179-74a5091722ca","Type":"ContainerStarted","Data":"b21ec826f598171846cc13bc4483b48c3588d74885ef5d97c020b4980933e67e"} Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.458186 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"d96238f0-f729-41c0-8505-952b39cc7ca9","Type":"ContainerStarted","Data":"3d485895efb0093ad7e0d61def4aa1cb638e82c48c8337cf2432996207f39ac8"} Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.462841 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dgm67" event={"ID":"29ac0da5-7bbd-420a-b56d-60c621244d30","Type":"ContainerStarted","Data":"00de9b162608c5f1411ab6e99a593f66f0e3e33b85a7f00a215e414c16aa2735"} Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.462883 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dgm67" event={"ID":"29ac0da5-7bbd-420a-b56d-60c621244d30","Type":"ContainerStarted","Data":"9aadc883df64377a232a7cadfb2dd8a3f638b985d885fa14a47727e9c83f2234"} Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.495670 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-dgm67" podStartSLOduration=2.495651812 podStartE2EDuration="2.495651812s" podCreationTimestamp="2026-03-08 05:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:09.481908502 +0000 UTC m=+1436.399557346" watchObservedRunningTime="2026-03-08 05:50:09.495651812 +0000 UTC m=+1436.413300646" Mar 08 05:50:09 crc kubenswrapper[4717]: I0308 05:50:09.704825 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s2nrg"] Mar 08 05:50:10 crc kubenswrapper[4717]: I0308 05:50:10.475305 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" event={"ID":"0a2f4071-99c2-4755-af9b-ba683a154d22","Type":"ContainerStarted","Data":"5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab"} Mar 08 05:50:10 crc kubenswrapper[4717]: I0308 05:50:10.475578 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:10 crc kubenswrapper[4717]: I0308 05:50:10.499322 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" podStartSLOduration=3.499306561 podStartE2EDuration="3.499306561s" podCreationTimestamp="2026-03-08 05:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:10.495854516 +0000 UTC m=+1437.413503380" watchObservedRunningTime="2026-03-08 05:50:10.499306561 +0000 UTC m=+1437.416955395" Mar 08 05:50:11 crc kubenswrapper[4717]: I0308 05:50:11.483676 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s2nrg" event={"ID":"84a798f5-2296-45b1-ad1e-5d31f85c67d3","Type":"ContainerStarted","Data":"396d439933a1bc5bf65eb9e92af0bd7782f39154f8b36c54955211a2aafa1de5"} Mar 08 05:50:11 crc kubenswrapper[4717]: I0308 05:50:11.517750 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:11 crc kubenswrapper[4717]: I0308 05:50:11.533450 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.498534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8d0ca92-311c-46a4-b179-74a5091722ca","Type":"ContainerStarted","Data":"03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951"} Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.500231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8d0ca92-311c-46a4-b179-74a5091722ca","Type":"ContainerStarted","Data":"801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02"} Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.498629 4717 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="a8d0ca92-311c-46a4-b179-74a5091722ca" containerName="nova-metadata-metadata" containerID="cri-o://03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951" gracePeriod=30 Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.498595 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a8d0ca92-311c-46a4-b179-74a5091722ca" containerName="nova-metadata-log" containerID="cri-o://801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02" gracePeriod=30 Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.500774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d96238f0-f729-41c0-8505-952b39cc7ca9","Type":"ContainerStarted","Data":"edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138"} Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.500817 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d96238f0-f729-41c0-8505-952b39cc7ca9","Type":"ContainerStarted","Data":"5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee"} Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.505216 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s2nrg" event={"ID":"84a798f5-2296-45b1-ad1e-5d31f85c67d3","Type":"ContainerStarted","Data":"0a66c7f0fe537f6e65ae7177f15bb0f6afa0ed616a621d68f28485607d40f336"} Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.511139 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a867b873-1e1b-4104-ae6d-34f2d216c3ca","Type":"ContainerStarted","Data":"b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce"} Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.511262 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" 
podUID="a867b873-1e1b-4104-ae6d-34f2d216c3ca" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce" gracePeriod=30 Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.513528 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e1eab09-61ae-43df-9173-7267107e9f24","Type":"ContainerStarted","Data":"3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595"} Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.523920 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.479654416 podStartE2EDuration="5.523902809s" podCreationTimestamp="2026-03-08 05:50:07 +0000 UTC" firstStartedPulling="2026-03-08 05:50:09.185751818 +0000 UTC m=+1436.103400662" lastFinishedPulling="2026-03-08 05:50:11.230000211 +0000 UTC m=+1438.147649055" observedRunningTime="2026-03-08 05:50:12.521677884 +0000 UTC m=+1439.439326758" watchObservedRunningTime="2026-03-08 05:50:12.523902809 +0000 UTC m=+1439.441551653" Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.546271 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.190074575 podStartE2EDuration="5.546252402s" podCreationTimestamp="2026-03-08 05:50:07 +0000 UTC" firstStartedPulling="2026-03-08 05:50:08.873941137 +0000 UTC m=+1435.791589981" lastFinishedPulling="2026-03-08 05:50:11.230118934 +0000 UTC m=+1438.147767808" observedRunningTime="2026-03-08 05:50:12.542887978 +0000 UTC m=+1439.460536812" watchObservedRunningTime="2026-03-08 05:50:12.546252402 +0000 UTC m=+1439.463901246" Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.564845 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.087685913 podStartE2EDuration="5.564827421s" podCreationTimestamp="2026-03-08 05:50:07 
+0000 UTC" firstStartedPulling="2026-03-08 05:50:08.783233844 +0000 UTC m=+1435.700882678" lastFinishedPulling="2026-03-08 05:50:11.260375302 +0000 UTC m=+1438.178024186" observedRunningTime="2026-03-08 05:50:12.560605867 +0000 UTC m=+1439.478254721" watchObservedRunningTime="2026-03-08 05:50:12.564827421 +0000 UTC m=+1439.482476265" Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.590169 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-s2nrg" podStartSLOduration=4.590145597 podStartE2EDuration="4.590145597s" podCreationTimestamp="2026-03-08 05:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:12.584139329 +0000 UTC m=+1439.501788173" watchObservedRunningTime="2026-03-08 05:50:12.590145597 +0000 UTC m=+1439.507794451" Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.604583 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.01032723 podStartE2EDuration="5.604564784s" podCreationTimestamp="2026-03-08 05:50:07 +0000 UTC" firstStartedPulling="2026-03-08 05:50:08.635951922 +0000 UTC m=+1435.553600766" lastFinishedPulling="2026-03-08 05:50:11.230189466 +0000 UTC m=+1438.147838320" observedRunningTime="2026-03-08 05:50:12.600070342 +0000 UTC m=+1439.517719206" watchObservedRunningTime="2026-03-08 05:50:12.604564784 +0000 UTC m=+1439.522213618" Mar 08 05:50:12 crc kubenswrapper[4717]: I0308 05:50:12.901346 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.109294 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.117749 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-combined-ca-bundle\") pod \"a8d0ca92-311c-46a4-b179-74a5091722ca\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.118227 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8d0ca92-311c-46a4-b179-74a5091722ca-logs\") pod \"a8d0ca92-311c-46a4-b179-74a5091722ca\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.118554 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-config-data\") pod \"a8d0ca92-311c-46a4-b179-74a5091722ca\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.118816 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8d0ca92-311c-46a4-b179-74a5091722ca-logs" (OuterVolumeSpecName: "logs") pod "a8d0ca92-311c-46a4-b179-74a5091722ca" (UID: "a8d0ca92-311c-46a4-b179-74a5091722ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.119279 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbv8z\" (UniqueName: \"kubernetes.io/projected/a8d0ca92-311c-46a4-b179-74a5091722ca-kube-api-access-lbv8z\") pod \"a8d0ca92-311c-46a4-b179-74a5091722ca\" (UID: \"a8d0ca92-311c-46a4-b179-74a5091722ca\") " Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.120499 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8d0ca92-311c-46a4-b179-74a5091722ca-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.127451 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d0ca92-311c-46a4-b179-74a5091722ca-kube-api-access-lbv8z" (OuterVolumeSpecName: "kube-api-access-lbv8z") pod "a8d0ca92-311c-46a4-b179-74a5091722ca" (UID: "a8d0ca92-311c-46a4-b179-74a5091722ca"). InnerVolumeSpecName "kube-api-access-lbv8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.178948 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-config-data" (OuterVolumeSpecName: "config-data") pod "a8d0ca92-311c-46a4-b179-74a5091722ca" (UID: "a8d0ca92-311c-46a4-b179-74a5091722ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.181853 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8d0ca92-311c-46a4-b179-74a5091722ca" (UID: "a8d0ca92-311c-46a4-b179-74a5091722ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.223003 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.223034 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbv8z\" (UniqueName: \"kubernetes.io/projected/a8d0ca92-311c-46a4-b179-74a5091722ca-kube-api-access-lbv8z\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.223044 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d0ca92-311c-46a4-b179-74a5091722ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.264100 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.527703 4717 generic.go:334] "Generic (PLEG): container finished" podID="a8d0ca92-311c-46a4-b179-74a5091722ca" containerID="03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951" exitCode=0 Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.527953 4717 generic.go:334] "Generic (PLEG): container finished" podID="a8d0ca92-311c-46a4-b179-74a5091722ca" containerID="801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02" exitCode=143 Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.528194 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8d0ca92-311c-46a4-b179-74a5091722ca","Type":"ContainerDied","Data":"03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951"} Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.528243 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"a8d0ca92-311c-46a4-b179-74a5091722ca","Type":"ContainerDied","Data":"801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02"} Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.528257 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8d0ca92-311c-46a4-b179-74a5091722ca","Type":"ContainerDied","Data":"b21ec826f598171846cc13bc4483b48c3588d74885ef5d97c020b4980933e67e"} Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.528278 4717 scope.go:117] "RemoveContainer" containerID="03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.528411 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.561662 4717 scope.go:117] "RemoveContainer" containerID="801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.580604 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.586551 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.603315 4717 scope.go:117] "RemoveContainer" containerID="03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951" Mar 08 05:50:13 crc kubenswrapper[4717]: E0308 05:50:13.608155 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951\": container with ID starting with 03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951 not found: ID does not exist" containerID="03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.608318 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951"} err="failed to get container status \"03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951\": rpc error: code = NotFound desc = could not find container \"03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951\": container with ID starting with 03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951 not found: ID does not exist" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.608384 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:13 crc kubenswrapper[4717]: E0308 05:50:13.608823 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d0ca92-311c-46a4-b179-74a5091722ca" containerName="nova-metadata-metadata" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.608840 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d0ca92-311c-46a4-b179-74a5091722ca" containerName="nova-metadata-metadata" Mar 08 05:50:13 crc kubenswrapper[4717]: E0308 05:50:13.608854 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d0ca92-311c-46a4-b179-74a5091722ca" containerName="nova-metadata-log" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.608860 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d0ca92-311c-46a4-b179-74a5091722ca" containerName="nova-metadata-log" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.608395 4717 scope.go:117] "RemoveContainer" containerID="801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.609043 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d0ca92-311c-46a4-b179-74a5091722ca" containerName="nova-metadata-metadata" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.609064 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a8d0ca92-311c-46a4-b179-74a5091722ca" containerName="nova-metadata-log" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.610059 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: E0308 05:50:13.617107 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02\": container with ID starting with 801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02 not found: ID does not exist" containerID="801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.617149 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02"} err="failed to get container status \"801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02\": rpc error: code = NotFound desc = could not find container \"801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02\": container with ID starting with 801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02 not found: ID does not exist" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.617175 4717 scope.go:117] "RemoveContainer" containerID="03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.617496 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.618132 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951"} err="failed to get container status 
\"03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951\": rpc error: code = NotFound desc = could not find container \"03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951\": container with ID starting with 03c4a6f987b2768f992c27d418785f75df99cff9e6b4cb23f94130c6782dd951 not found: ID does not exist" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.618153 4717 scope.go:117] "RemoveContainer" containerID="801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.618321 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02"} err="failed to get container status \"801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02\": rpc error: code = NotFound desc = could not find container \"801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02\": container with ID starting with 801dd6eba222c952e3d868fcc8455056a486e9afa327341815bef0b1edc20b02 not found: ID does not exist" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.624707 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.629270 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.733366 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.733475 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-config-data\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.733515 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.733553 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ee8450-26f4-4689-ad41-09426d65b6ff-logs\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.733703 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2qjn\" (UniqueName: \"kubernetes.io/projected/95ee8450-26f4-4689-ad41-09426d65b6ff-kube-api-access-g2qjn\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.796581 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d0ca92-311c-46a4-b179-74a5091722ca" path="/var/lib/kubelet/pods/a8d0ca92-311c-46a4-b179-74a5091722ca/volumes" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.835087 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2qjn\" (UniqueName: \"kubernetes.io/projected/95ee8450-26f4-4689-ad41-09426d65b6ff-kube-api-access-g2qjn\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc 
kubenswrapper[4717]: I0308 05:50:13.835141 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.835182 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-config-data\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.835223 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.835257 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ee8450-26f4-4689-ad41-09426d65b6ff-logs\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.835743 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ee8450-26f4-4689-ad41-09426d65b6ff-logs\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.836864 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.837018 4717 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.839032 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.848285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.850324 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-config-data\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.859285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2qjn\" (UniqueName: \"kubernetes.io/projected/95ee8450-26f4-4689-ad41-09426d65b6ff-kube-api-access-g2qjn\") pod \"nova-metadata-0\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " pod="openstack/nova-metadata-0" Mar 08 05:50:13 crc kubenswrapper[4717]: I0308 05:50:13.963271 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:50:14 crc kubenswrapper[4717]: I0308 05:50:14.392992 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:14 crc kubenswrapper[4717]: I0308 05:50:14.541123 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95ee8450-26f4-4689-ad41-09426d65b6ff","Type":"ContainerStarted","Data":"492dcd550d336145aab5cf7fa4e69ecc1cade9104906428d8657a5d0d1878d61"} Mar 08 05:50:15 crc kubenswrapper[4717]: I0308 05:50:15.561007 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95ee8450-26f4-4689-ad41-09426d65b6ff","Type":"ContainerStarted","Data":"a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60"} Mar 08 05:50:15 crc kubenswrapper[4717]: I0308 05:50:15.561528 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95ee8450-26f4-4689-ad41-09426d65b6ff","Type":"ContainerStarted","Data":"42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef"} Mar 08 05:50:15 crc kubenswrapper[4717]: I0308 05:50:15.594843 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.59480707 podStartE2EDuration="2.59480707s" podCreationTimestamp="2026-03-08 05:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:15.586272229 +0000 UTC m=+1442.503921133" watchObservedRunningTime="2026-03-08 05:50:15.59480707 +0000 UTC m=+1442.512455954" Mar 08 05:50:16 crc kubenswrapper[4717]: I0308 05:50:16.575819 4717 generic.go:334] "Generic (PLEG): container finished" podID="29ac0da5-7bbd-420a-b56d-60c621244d30" containerID="00de9b162608c5f1411ab6e99a593f66f0e3e33b85a7f00a215e414c16aa2735" exitCode=0 Mar 08 05:50:16 crc kubenswrapper[4717]: I0308 05:50:16.575962 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dgm67" event={"ID":"29ac0da5-7bbd-420a-b56d-60c621244d30","Type":"ContainerDied","Data":"00de9b162608c5f1411ab6e99a593f66f0e3e33b85a7f00a215e414c16aa2735"} Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.049981 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.071307 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.071571 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.229068 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-config-data\") pod \"29ac0da5-7bbd-420a-b56d-60c621244d30\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.229614 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-combined-ca-bundle\") pod \"29ac0da5-7bbd-420a-b56d-60c621244d30\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.229757 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-scripts\") pod \"29ac0da5-7bbd-420a-b56d-60c621244d30\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.229921 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vts2g\" (UniqueName: 
\"kubernetes.io/projected/29ac0da5-7bbd-420a-b56d-60c621244d30-kube-api-access-vts2g\") pod \"29ac0da5-7bbd-420a-b56d-60c621244d30\" (UID: \"29ac0da5-7bbd-420a-b56d-60c621244d30\") " Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.256276 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-scripts" (OuterVolumeSpecName: "scripts") pod "29ac0da5-7bbd-420a-b56d-60c621244d30" (UID: "29ac0da5-7bbd-420a-b56d-60c621244d30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.256375 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ac0da5-7bbd-420a-b56d-60c621244d30-kube-api-access-vts2g" (OuterVolumeSpecName: "kube-api-access-vts2g") pod "29ac0da5-7bbd-420a-b56d-60c621244d30" (UID: "29ac0da5-7bbd-420a-b56d-60c621244d30"). InnerVolumeSpecName "kube-api-access-vts2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.263098 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29ac0da5-7bbd-420a-b56d-60c621244d30" (UID: "29ac0da5-7bbd-420a-b56d-60c621244d30"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.263216 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.265283 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-config-data" (OuterVolumeSpecName: "config-data") pod "29ac0da5-7bbd-420a-b56d-60c621244d30" (UID: "29ac0da5-7bbd-420a-b56d-60c621244d30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.300123 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.333346 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.333385 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vts2g\" (UniqueName: \"kubernetes.io/projected/29ac0da5-7bbd-420a-b56d-60c621244d30-kube-api-access-vts2g\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.333397 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.333409 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ac0da5-7bbd-420a-b56d-60c621244d30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.388834 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.458188 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fdf58bb7c-v2hj5"] Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.458428 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" podUID="41e72e63-490d-46aa-b4ff-68e33f7def1c" containerName="dnsmasq-dns" containerID="cri-o://6d17d523fafd41c0a08ce93a9f16c397a98b918f76bb985c91310a55468a5d3f" gracePeriod=10 Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.609090 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dgm67" event={"ID":"29ac0da5-7bbd-420a-b56d-60c621244d30","Type":"ContainerDied","Data":"9aadc883df64377a232a7cadfb2dd8a3f638b985d885fa14a47727e9c83f2234"} Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.609131 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aadc883df64377a232a7cadfb2dd8a3f638b985d885fa14a47727e9c83f2234" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.609181 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dgm67" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.612960 4717 generic.go:334] "Generic (PLEG): container finished" podID="41e72e63-490d-46aa-b4ff-68e33f7def1c" containerID="6d17d523fafd41c0a08ce93a9f16c397a98b918f76bb985c91310a55468a5d3f" exitCode=0 Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.613753 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" event={"ID":"41e72e63-490d-46aa-b4ff-68e33f7def1c","Type":"ContainerDied","Data":"6d17d523fafd41c0a08ce93a9f16c397a98b918f76bb985c91310a55468a5d3f"} Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.669235 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.776427 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.776773 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerName="nova-api-log" containerID="cri-o://5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee" gracePeriod=30 Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.776845 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerName="nova-api-api" containerID="cri-o://edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138" gracePeriod=30 Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.801081 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": EOF" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.804889 4717 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": EOF" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.874128 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.874618 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="95ee8450-26f4-4689-ad41-09426d65b6ff" containerName="nova-metadata-log" containerID="cri-o://42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef" gracePeriod=30 Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.874944 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="95ee8450-26f4-4689-ad41-09426d65b6ff" containerName="nova-metadata-metadata" containerID="cri-o://a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60" gracePeriod=30 Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.963812 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 05:50:18 crc kubenswrapper[4717]: I0308 05:50:18.963911 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.042131 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.147898 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-nb\") pod \"41e72e63-490d-46aa-b4ff-68e33f7def1c\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.148072 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-swift-storage-0\") pod \"41e72e63-490d-46aa-b4ff-68e33f7def1c\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.148131 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-config\") pod \"41e72e63-490d-46aa-b4ff-68e33f7def1c\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.148167 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-sb\") pod \"41e72e63-490d-46aa-b4ff-68e33f7def1c\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.148190 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-svc\") pod \"41e72e63-490d-46aa-b4ff-68e33f7def1c\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.148231 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxgfb\" 
(UniqueName: \"kubernetes.io/projected/41e72e63-490d-46aa-b4ff-68e33f7def1c-kube-api-access-vxgfb\") pod \"41e72e63-490d-46aa-b4ff-68e33f7def1c\" (UID: \"41e72e63-490d-46aa-b4ff-68e33f7def1c\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.153060 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e72e63-490d-46aa-b4ff-68e33f7def1c-kube-api-access-vxgfb" (OuterVolumeSpecName: "kube-api-access-vxgfb") pod "41e72e63-490d-46aa-b4ff-68e33f7def1c" (UID: "41e72e63-490d-46aa-b4ff-68e33f7def1c"). InnerVolumeSpecName "kube-api-access-vxgfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.192540 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.217205 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "41e72e63-490d-46aa-b4ff-68e33f7def1c" (UID: "41e72e63-490d-46aa-b4ff-68e33f7def1c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.226519 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-config" (OuterVolumeSpecName: "config") pod "41e72e63-490d-46aa-b4ff-68e33f7def1c" (UID: "41e72e63-490d-46aa-b4ff-68e33f7def1c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.240309 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "41e72e63-490d-46aa-b4ff-68e33f7def1c" (UID: "41e72e63-490d-46aa-b4ff-68e33f7def1c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.253912 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.253948 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.253958 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxgfb\" (UniqueName: \"kubernetes.io/projected/41e72e63-490d-46aa-b4ff-68e33f7def1c-kube-api-access-vxgfb\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.253967 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.261979 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "41e72e63-490d-46aa-b4ff-68e33f7def1c" (UID: "41e72e63-490d-46aa-b4ff-68e33f7def1c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.285239 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "41e72e63-490d-46aa-b4ff-68e33f7def1c" (UID: "41e72e63-490d-46aa-b4ff-68e33f7def1c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.355743 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.355769 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41e72e63-490d-46aa-b4ff-68e33f7def1c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.394836 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.558701 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2qjn\" (UniqueName: \"kubernetes.io/projected/95ee8450-26f4-4689-ad41-09426d65b6ff-kube-api-access-g2qjn\") pod \"95ee8450-26f4-4689-ad41-09426d65b6ff\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.558763 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ee8450-26f4-4689-ad41-09426d65b6ff-logs\") pod \"95ee8450-26f4-4689-ad41-09426d65b6ff\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.558923 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-nova-metadata-tls-certs\") pod \"95ee8450-26f4-4689-ad41-09426d65b6ff\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.558955 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-combined-ca-bundle\") pod \"95ee8450-26f4-4689-ad41-09426d65b6ff\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.558995 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-config-data\") pod \"95ee8450-26f4-4689-ad41-09426d65b6ff\" (UID: \"95ee8450-26f4-4689-ad41-09426d65b6ff\") " Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.559564 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/95ee8450-26f4-4689-ad41-09426d65b6ff-logs" (OuterVolumeSpecName: "logs") pod "95ee8450-26f4-4689-ad41-09426d65b6ff" (UID: "95ee8450-26f4-4689-ad41-09426d65b6ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.559750 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95ee8450-26f4-4689-ad41-09426d65b6ff-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.563860 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ee8450-26f4-4689-ad41-09426d65b6ff-kube-api-access-g2qjn" (OuterVolumeSpecName: "kube-api-access-g2qjn") pod "95ee8450-26f4-4689-ad41-09426d65b6ff" (UID: "95ee8450-26f4-4689-ad41-09426d65b6ff"). InnerVolumeSpecName "kube-api-access-g2qjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.595498 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-config-data" (OuterVolumeSpecName: "config-data") pod "95ee8450-26f4-4689-ad41-09426d65b6ff" (UID: "95ee8450-26f4-4689-ad41-09426d65b6ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.603211 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95ee8450-26f4-4689-ad41-09426d65b6ff" (UID: "95ee8450-26f4-4689-ad41-09426d65b6ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.617558 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "95ee8450-26f4-4689-ad41-09426d65b6ff" (UID: "95ee8450-26f4-4689-ad41-09426d65b6ff"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.629921 4717 generic.go:334] "Generic (PLEG): container finished" podID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerID="5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee" exitCode=143 Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.630018 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d96238f0-f729-41c0-8505-952b39cc7ca9","Type":"ContainerDied","Data":"5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee"} Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.633930 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" event={"ID":"41e72e63-490d-46aa-b4ff-68e33f7def1c","Type":"ContainerDied","Data":"9d2db1dd4933d5a3d32598c00abe59f311debbcd4e7f2aca8d2a7349d70a403b"} Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.633967 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fdf58bb7c-v2hj5" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.634000 4717 scope.go:117] "RemoveContainer" containerID="6d17d523fafd41c0a08ce93a9f16c397a98b918f76bb985c91310a55468a5d3f" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.639891 4717 generic.go:334] "Generic (PLEG): container finished" podID="95ee8450-26f4-4689-ad41-09426d65b6ff" containerID="a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60" exitCode=0 Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.639928 4717 generic.go:334] "Generic (PLEG): container finished" podID="95ee8450-26f4-4689-ad41-09426d65b6ff" containerID="42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef" exitCode=143 Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.639985 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.639986 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95ee8450-26f4-4689-ad41-09426d65b6ff","Type":"ContainerDied","Data":"a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60"} Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.640112 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95ee8450-26f4-4689-ad41-09426d65b6ff","Type":"ContainerDied","Data":"42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef"} Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.640135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95ee8450-26f4-4689-ad41-09426d65b6ff","Type":"ContainerDied","Data":"492dcd550d336145aab5cf7fa4e69ecc1cade9104906428d8657a5d0d1878d61"} Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.661878 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.662039 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.662116 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ee8450-26f4-4689-ad41-09426d65b6ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.662188 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2qjn\" (UniqueName: \"kubernetes.io/projected/95ee8450-26f4-4689-ad41-09426d65b6ff-kube-api-access-g2qjn\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.664632 4717 scope.go:117] "RemoveContainer" containerID="dd0523f63c13e73265160658775fc47c36a80a21d806c65e934dd2569a3f8017" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.689717 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fdf58bb7c-v2hj5"] Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.701123 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fdf58bb7c-v2hj5"] Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.705128 4717 scope.go:117] "RemoveContainer" containerID="a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.730555 4717 scope.go:117] "RemoveContainer" containerID="42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.740360 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 
05:50:19.756737 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.771762 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:19 crc kubenswrapper[4717]: E0308 05:50:19.772206 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ee8450-26f4-4689-ad41-09426d65b6ff" containerName="nova-metadata-log" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.772227 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ee8450-26f4-4689-ad41-09426d65b6ff" containerName="nova-metadata-log" Mar 08 05:50:19 crc kubenswrapper[4717]: E0308 05:50:19.772250 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e72e63-490d-46aa-b4ff-68e33f7def1c" containerName="dnsmasq-dns" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.772257 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e72e63-490d-46aa-b4ff-68e33f7def1c" containerName="dnsmasq-dns" Mar 08 05:50:19 crc kubenswrapper[4717]: E0308 05:50:19.772266 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ee8450-26f4-4689-ad41-09426d65b6ff" containerName="nova-metadata-metadata" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.772272 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ee8450-26f4-4689-ad41-09426d65b6ff" containerName="nova-metadata-metadata" Mar 08 05:50:19 crc kubenswrapper[4717]: E0308 05:50:19.772294 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ac0da5-7bbd-420a-b56d-60c621244d30" containerName="nova-manage" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.772299 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ac0da5-7bbd-420a-b56d-60c621244d30" containerName="nova-manage" Mar 08 05:50:19 crc kubenswrapper[4717]: E0308 05:50:19.772308 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="41e72e63-490d-46aa-b4ff-68e33f7def1c" containerName="init" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.772315 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e72e63-490d-46aa-b4ff-68e33f7def1c" containerName="init" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.772499 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e72e63-490d-46aa-b4ff-68e33f7def1c" containerName="dnsmasq-dns" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.772517 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ac0da5-7bbd-420a-b56d-60c621244d30" containerName="nova-manage" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.772528 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ee8450-26f4-4689-ad41-09426d65b6ff" containerName="nova-metadata-log" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.772539 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ee8450-26f4-4689-ad41-09426d65b6ff" containerName="nova-metadata-metadata" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.773554 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.776051 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.776957 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.792563 4717 scope.go:117] "RemoveContainer" containerID="a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60" Mar 08 05:50:19 crc kubenswrapper[4717]: E0308 05:50:19.793213 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60\": container with ID starting with a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60 not found: ID does not exist" containerID="a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.793245 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60"} err="failed to get container status \"a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60\": rpc error: code = NotFound desc = could not find container \"a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60\": container with ID starting with a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60 not found: ID does not exist" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.793266 4717 scope.go:117] "RemoveContainer" containerID="42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef" Mar 08 05:50:19 crc kubenswrapper[4717]: E0308 05:50:19.793677 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef\": container with ID starting with 42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef not found: ID does not exist" containerID="42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.793724 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef"} err="failed to get container status \"42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef\": rpc error: code = NotFound desc = could not find container \"42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef\": container with ID starting with 42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef not found: ID does not exist" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.793740 4717 scope.go:117] "RemoveContainer" containerID="a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.794027 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60"} err="failed to get container status \"a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60\": rpc error: code = NotFound desc = could not find container \"a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60\": container with ID starting with a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60 not found: ID does not exist" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.794066 4717 scope.go:117] "RemoveContainer" containerID="42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.794624 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef"} err="failed to get container status \"42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef\": rpc error: code = NotFound desc = could not find container \"42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef\": container with ID starting with 42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef not found: ID does not exist" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.804573 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e72e63-490d-46aa-b4ff-68e33f7def1c" path="/var/lib/kubelet/pods/41e72e63-490d-46aa-b4ff-68e33f7def1c/volumes" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.805198 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ee8450-26f4-4689-ad41-09426d65b6ff" path="/var/lib/kubelet/pods/95ee8450-26f4-4689-ad41-09426d65b6ff/volumes" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.805747 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.867018 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d43b16c-5bb0-4724-8df1-2b83168b22ce-logs\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.867158 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnqdr\" (UniqueName: \"kubernetes.io/projected/9d43b16c-5bb0-4724-8df1-2b83168b22ce-kube-api-access-qnqdr\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.867262 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-config-data\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.867305 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.867348 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.969138 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-config-data\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.969979 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.970136 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.970459 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d43b16c-5bb0-4724-8df1-2b83168b22ce-logs\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.970746 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnqdr\" (UniqueName: \"kubernetes.io/projected/9d43b16c-5bb0-4724-8df1-2b83168b22ce-kube-api-access-qnqdr\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.971284 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d43b16c-5bb0-4724-8df1-2b83168b22ce-logs\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.975170 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.977723 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc 
kubenswrapper[4717]: I0308 05:50:19.979264 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-config-data\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:19 crc kubenswrapper[4717]: I0308 05:50:19.998222 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnqdr\" (UniqueName: \"kubernetes.io/projected/9d43b16c-5bb0-4724-8df1-2b83168b22ce-kube-api-access-qnqdr\") pod \"nova-metadata-0\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " pod="openstack/nova-metadata-0" Mar 08 05:50:20 crc kubenswrapper[4717]: I0308 05:50:20.098033 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:50:20 crc kubenswrapper[4717]: I0308 05:50:20.638271 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:50:20 crc kubenswrapper[4717]: I0308 05:50:20.658073 4717 generic.go:334] "Generic (PLEG): container finished" podID="84a798f5-2296-45b1-ad1e-5d31f85c67d3" containerID="0a66c7f0fe537f6e65ae7177f15bb0f6afa0ed616a621d68f28485607d40f336" exitCode=0 Mar 08 05:50:20 crc kubenswrapper[4717]: I0308 05:50:20.658147 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s2nrg" event={"ID":"84a798f5-2296-45b1-ad1e-5d31f85c67d3","Type":"ContainerDied","Data":"0a66c7f0fe537f6e65ae7177f15bb0f6afa0ed616a621d68f28485607d40f336"} Mar 08 05:50:20 crc kubenswrapper[4717]: I0308 05:50:20.666696 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4e1eab09-61ae-43df-9173-7267107e9f24" containerName="nova-scheduler-scheduler" containerID="cri-o://3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595" gracePeriod=30 Mar 08 05:50:21 crc kubenswrapper[4717]: I0308 
05:50:21.683849 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d43b16c-5bb0-4724-8df1-2b83168b22ce","Type":"ContainerStarted","Data":"c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889"} Mar 08 05:50:21 crc kubenswrapper[4717]: I0308 05:50:21.684393 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d43b16c-5bb0-4724-8df1-2b83168b22ce","Type":"ContainerStarted","Data":"fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96"} Mar 08 05:50:21 crc kubenswrapper[4717]: I0308 05:50:21.684408 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d43b16c-5bb0-4724-8df1-2b83168b22ce","Type":"ContainerStarted","Data":"aa84f1020c7b31622ad4f9eb958c8164cfd8ea66338a54a6074925eb4f8c7c3e"} Mar 08 05:50:21 crc kubenswrapper[4717]: I0308 05:50:21.723497 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.723482308 podStartE2EDuration="2.723482308s" podCreationTimestamp="2026-03-08 05:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:21.718458514 +0000 UTC m=+1448.636107438" watchObservedRunningTime="2026-03-08 05:50:21.723482308 +0000 UTC m=+1448.641131162" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.108633 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.229471 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-config-data\") pod \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.229672 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-scripts\") pod \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.229748 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-combined-ca-bundle\") pod \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.229798 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fshgw\" (UniqueName: \"kubernetes.io/projected/84a798f5-2296-45b1-ad1e-5d31f85c67d3-kube-api-access-fshgw\") pod \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\" (UID: \"84a798f5-2296-45b1-ad1e-5d31f85c67d3\") " Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.259876 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-scripts" (OuterVolumeSpecName: "scripts") pod "84a798f5-2296-45b1-ad1e-5d31f85c67d3" (UID: "84a798f5-2296-45b1-ad1e-5d31f85c67d3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.261833 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a798f5-2296-45b1-ad1e-5d31f85c67d3-kube-api-access-fshgw" (OuterVolumeSpecName: "kube-api-access-fshgw") pod "84a798f5-2296-45b1-ad1e-5d31f85c67d3" (UID: "84a798f5-2296-45b1-ad1e-5d31f85c67d3"). InnerVolumeSpecName "kube-api-access-fshgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.291789 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84a798f5-2296-45b1-ad1e-5d31f85c67d3" (UID: "84a798f5-2296-45b1-ad1e-5d31f85c67d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.296993 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-config-data" (OuterVolumeSpecName: "config-data") pod "84a798f5-2296-45b1-ad1e-5d31f85c67d3" (UID: "84a798f5-2296-45b1-ad1e-5d31f85c67d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.332386 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.332439 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.332453 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a798f5-2296-45b1-ad1e-5d31f85c67d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.332468 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fshgw\" (UniqueName: \"kubernetes.io/projected/84a798f5-2296-45b1-ad1e-5d31f85c67d3-kube-api-access-fshgw\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.391666 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.535519 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d96238f0-f729-41c0-8505-952b39cc7ca9-logs\") pod \"d96238f0-f729-41c0-8505-952b39cc7ca9\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.535795 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-config-data\") pod \"d96238f0-f729-41c0-8505-952b39cc7ca9\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.535893 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2vdt\" (UniqueName: \"kubernetes.io/projected/d96238f0-f729-41c0-8505-952b39cc7ca9-kube-api-access-h2vdt\") pod \"d96238f0-f729-41c0-8505-952b39cc7ca9\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.536128 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-combined-ca-bundle\") pod \"d96238f0-f729-41c0-8505-952b39cc7ca9\" (UID: \"d96238f0-f729-41c0-8505-952b39cc7ca9\") " Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.536463 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d96238f0-f729-41c0-8505-952b39cc7ca9-logs" (OuterVolumeSpecName: "logs") pod "d96238f0-f729-41c0-8505-952b39cc7ca9" (UID: "d96238f0-f729-41c0-8505-952b39cc7ca9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.537228 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d96238f0-f729-41c0-8505-952b39cc7ca9-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.541424 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d96238f0-f729-41c0-8505-952b39cc7ca9-kube-api-access-h2vdt" (OuterVolumeSpecName: "kube-api-access-h2vdt") pod "d96238f0-f729-41c0-8505-952b39cc7ca9" (UID: "d96238f0-f729-41c0-8505-952b39cc7ca9"). InnerVolumeSpecName "kube-api-access-h2vdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.589746 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-config-data" (OuterVolumeSpecName: "config-data") pod "d96238f0-f729-41c0-8505-952b39cc7ca9" (UID: "d96238f0-f729-41c0-8505-952b39cc7ca9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.591289 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d96238f0-f729-41c0-8505-952b39cc7ca9" (UID: "d96238f0-f729-41c0-8505-952b39cc7ca9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.639666 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.639754 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d96238f0-f729-41c0-8505-952b39cc7ca9-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.639781 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2vdt\" (UniqueName: \"kubernetes.io/projected/d96238f0-f729-41c0-8505-952b39cc7ca9-kube-api-access-h2vdt\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.697717 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s2nrg" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.697661 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s2nrg" event={"ID":"84a798f5-2296-45b1-ad1e-5d31f85c67d3","Type":"ContainerDied","Data":"396d439933a1bc5bf65eb9e92af0bd7782f39154f8b36c54955211a2aafa1de5"} Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.697946 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="396d439933a1bc5bf65eb9e92af0bd7782f39154f8b36c54955211a2aafa1de5" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.701213 4717 generic.go:334] "Generic (PLEG): container finished" podID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerID="edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138" exitCode=0 Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.701350 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.701413 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d96238f0-f729-41c0-8505-952b39cc7ca9","Type":"ContainerDied","Data":"edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138"} Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.701456 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d96238f0-f729-41c0-8505-952b39cc7ca9","Type":"ContainerDied","Data":"3d485895efb0093ad7e0d61def4aa1cb638e82c48c8337cf2432996207f39ac8"} Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.701486 4717 scope.go:117] "RemoveContainer" containerID="edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.739994 4717 scope.go:117] "RemoveContainer" containerID="5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.774971 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.784839 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.788539 4717 scope.go:117] "RemoveContainer" containerID="edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138" Mar 08 05:50:22 crc kubenswrapper[4717]: E0308 05:50:22.789998 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138\": container with ID starting with edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138 not found: ID does not exist" containerID="edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.790060 
4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138"} err="failed to get container status \"edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138\": rpc error: code = NotFound desc = could not find container \"edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138\": container with ID starting with edee9cdc90a5d60e5b153ad3192b8afd07132fb995a525e8a73cd1f6e9a6f138 not found: ID does not exist" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.790093 4717 scope.go:117] "RemoveContainer" containerID="5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee" Mar 08 05:50:22 crc kubenswrapper[4717]: E0308 05:50:22.793159 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee\": container with ID starting with 5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee not found: ID does not exist" containerID="5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.793234 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee"} err="failed to get container status \"5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee\": rpc error: code = NotFound desc = could not find container \"5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee\": container with ID starting with 5b614ed2889661b95ad1efc893506b0dff8aa71ac56de278e1f0517da30992ee not found: ID does not exist" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.797730 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 05:50:22 crc kubenswrapper[4717]: E0308 05:50:22.798459 4717 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a798f5-2296-45b1-ad1e-5d31f85c67d3" containerName="nova-cell1-conductor-db-sync" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.798493 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a798f5-2296-45b1-ad1e-5d31f85c67d3" containerName="nova-cell1-conductor-db-sync" Mar 08 05:50:22 crc kubenswrapper[4717]: E0308 05:50:22.798538 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerName="nova-api-log" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.798553 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerName="nova-api-log" Mar 08 05:50:22 crc kubenswrapper[4717]: E0308 05:50:22.798601 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerName="nova-api-api" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.798617 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerName="nova-api-api" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.799053 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerName="nova-api-log" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.799084 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="84a798f5-2296-45b1-ad1e-5d31f85c67d3" containerName="nova-cell1-conductor-db-sync" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.799132 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" containerName="nova-api-api" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.800350 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.805114 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.809943 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.822212 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.825498 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.827278 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.855530 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.947706 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a7fb6f-308f-468e-8e2c-7adc53b2eb15-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"65a7fb6f-308f-468e-8e2c-7adc53b2eb15\") " pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.947764 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db411730-378f-4d00-971c-cd18e52fa179-logs\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.947789 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.947826 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmngp\" (UniqueName: \"kubernetes.io/projected/db411730-378f-4d00-971c-cd18e52fa179-kube-api-access-zmngp\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.948078 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a7fb6f-308f-468e-8e2c-7adc53b2eb15-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"65a7fb6f-308f-468e-8e2c-7adc53b2eb15\") " pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.948371 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw6rn\" (UniqueName: \"kubernetes.io/projected/65a7fb6f-308f-468e-8e2c-7adc53b2eb15-kube-api-access-fw6rn\") pod \"nova-cell1-conductor-0\" (UID: \"65a7fb6f-308f-468e-8e2c-7adc53b2eb15\") " pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:22 crc kubenswrapper[4717]: I0308 05:50:22.948452 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-config-data\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.050903 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw6rn\" (UniqueName: \"kubernetes.io/projected/65a7fb6f-308f-468e-8e2c-7adc53b2eb15-kube-api-access-fw6rn\") pod 
\"nova-cell1-conductor-0\" (UID: \"65a7fb6f-308f-468e-8e2c-7adc53b2eb15\") " pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.050986 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-config-data\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.051047 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a7fb6f-308f-468e-8e2c-7adc53b2eb15-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"65a7fb6f-308f-468e-8e2c-7adc53b2eb15\") " pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.051074 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.051093 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db411730-378f-4d00-971c-cd18e52fa179-logs\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.051117 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmngp\" (UniqueName: \"kubernetes.io/projected/db411730-378f-4d00-971c-cd18e52fa179-kube-api-access-zmngp\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.051162 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a7fb6f-308f-468e-8e2c-7adc53b2eb15-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"65a7fb6f-308f-468e-8e2c-7adc53b2eb15\") " pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.052080 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db411730-378f-4d00-971c-cd18e52fa179-logs\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.055334 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.056952 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-config-data\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.058320 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a7fb6f-308f-468e-8e2c-7adc53b2eb15-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"65a7fb6f-308f-468e-8e2c-7adc53b2eb15\") " pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.059181 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a7fb6f-308f-468e-8e2c-7adc53b2eb15-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"65a7fb6f-308f-468e-8e2c-7adc53b2eb15\") " 
pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.072276 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw6rn\" (UniqueName: \"kubernetes.io/projected/65a7fb6f-308f-468e-8e2c-7adc53b2eb15-kube-api-access-fw6rn\") pod \"nova-cell1-conductor-0\" (UID: \"65a7fb6f-308f-468e-8e2c-7adc53b2eb15\") " pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.078286 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmngp\" (UniqueName: \"kubernetes.io/projected/db411730-378f-4d00-971c-cd18e52fa179-kube-api-access-zmngp\") pod \"nova-api-0\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " pod="openstack/nova-api-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.129555 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.149792 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:50:23 crc kubenswrapper[4717]: E0308 05:50:23.295392 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595 is running failed: container process not found" containerID="3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 05:50:23 crc kubenswrapper[4717]: E0308 05:50:23.297914 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595 is running failed: container process not found" containerID="3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 05:50:23 crc kubenswrapper[4717]: E0308 05:50:23.299199 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595 is running failed: container process not found" containerID="3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 05:50:23 crc kubenswrapper[4717]: E0308 05:50:23.299252 4717 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4e1eab09-61ae-43df-9173-7267107e9f24" containerName="nova-scheduler-scheduler" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.704402 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.730643 4717 generic.go:334] "Generic (PLEG): container finished" podID="4e1eab09-61ae-43df-9173-7267107e9f24" containerID="3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595" exitCode=0 Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.730773 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e1eab09-61ae-43df-9173-7267107e9f24","Type":"ContainerDied","Data":"3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595"} Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.730857 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e1eab09-61ae-43df-9173-7267107e9f24","Type":"ContainerDied","Data":"9db180491c91035e48b830586a5b562c3f31e6a5b2768b448d6411543b3fdcf2"} Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.730882 4717 scope.go:117] "RemoveContainer" containerID="3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.730807 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.734996 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.771843 4717 scope.go:117] "RemoveContainer" containerID="3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595" Mar 08 05:50:23 crc kubenswrapper[4717]: E0308 05:50:23.772453 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595\": container with ID starting with 3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595 not found: ID does not exist" containerID="3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.772503 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595"} err="failed to get container status \"3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595\": rpc error: code = NotFound desc = could not find container \"3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595\": container with ID starting with 3f279eb7ce3f21e77341997bb18f18efdc1febceea61dc561e75904962012595 not found: ID does not exist" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.798727 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d96238f0-f729-41c0-8505-952b39cc7ca9" path="/var/lib/kubelet/pods/d96238f0-f729-41c0-8505-952b39cc7ca9/volumes" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.848648 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.875403 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-combined-ca-bundle\") pod \"4e1eab09-61ae-43df-9173-7267107e9f24\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.875439 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-config-data\") pod \"4e1eab09-61ae-43df-9173-7267107e9f24\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.875513 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cml7k\" (UniqueName: \"kubernetes.io/projected/4e1eab09-61ae-43df-9173-7267107e9f24-kube-api-access-cml7k\") pod \"4e1eab09-61ae-43df-9173-7267107e9f24\" (UID: \"4e1eab09-61ae-43df-9173-7267107e9f24\") " Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.879613 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1eab09-61ae-43df-9173-7267107e9f24-kube-api-access-cml7k" (OuterVolumeSpecName: "kube-api-access-cml7k") pod "4e1eab09-61ae-43df-9173-7267107e9f24" (UID: "4e1eab09-61ae-43df-9173-7267107e9f24"). InnerVolumeSpecName "kube-api-access-cml7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.912115 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-config-data" (OuterVolumeSpecName: "config-data") pod "4e1eab09-61ae-43df-9173-7267107e9f24" (UID: "4e1eab09-61ae-43df-9173-7267107e9f24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.912732 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e1eab09-61ae-43df-9173-7267107e9f24" (UID: "4e1eab09-61ae-43df-9173-7267107e9f24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.978141 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.978185 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1eab09-61ae-43df-9173-7267107e9f24-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:23 crc kubenswrapper[4717]: I0308 05:50:23.978198 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cml7k\" (UniqueName: \"kubernetes.io/projected/4e1eab09-61ae-43df-9173-7267107e9f24-kube-api-access-cml7k\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.200252 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.211431 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.238201 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:50:24 crc kubenswrapper[4717]: E0308 05:50:24.238655 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1eab09-61ae-43df-9173-7267107e9f24" containerName="nova-scheduler-scheduler" Mar 08 05:50:24 
crc kubenswrapper[4717]: I0308 05:50:24.238666 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1eab09-61ae-43df-9173-7267107e9f24" containerName="nova-scheduler-scheduler" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.238936 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1eab09-61ae-43df-9173-7267107e9f24" containerName="nova-scheduler-scheduler" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.239588 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.245141 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.255877 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.386809 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.386989 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-config-data\") pod \"nova-scheduler-0\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.387068 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhf48\" (UniqueName: \"kubernetes.io/projected/42fad031-16bc-4641-9f97-3b89351d0b89-kube-api-access-nhf48\") pod \"nova-scheduler-0\" (UID: 
\"42fad031-16bc-4641-9f97-3b89351d0b89\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.489115 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-config-data\") pod \"nova-scheduler-0\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.489174 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhf48\" (UniqueName: \"kubernetes.io/projected/42fad031-16bc-4641-9f97-3b89351d0b89-kube-api-access-nhf48\") pod \"nova-scheduler-0\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.489296 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.493621 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.503860 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-config-data\") pod \"nova-scheduler-0\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.518195 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nhf48\" (UniqueName: \"kubernetes.io/projected/42fad031-16bc-4641-9f97-3b89351d0b89-kube-api-access-nhf48\") pod \"nova-scheduler-0\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.561629 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.749525 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"65a7fb6f-308f-468e-8e2c-7adc53b2eb15","Type":"ContainerStarted","Data":"66c2194f81b99eb3bad37f00b7388f41be0dfd64ad584e37215d34aea2440726"} Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.749572 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"65a7fb6f-308f-468e-8e2c-7adc53b2eb15","Type":"ContainerStarted","Data":"6053a2b3d13d9fd8c7a4b6bd382ea5e7a11c281c01ba6efe10734d54a70a408e"} Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.749674 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.753178 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db411730-378f-4d00-971c-cd18e52fa179","Type":"ContainerStarted","Data":"b45b86c0fa28d23218a2cf70be91d8d9208e1bce0d0cec38d3f7b2c1b1c8c133"} Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.753228 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db411730-378f-4d00-971c-cd18e52fa179","Type":"ContainerStarted","Data":"7637a5cfab9de0e7ef94636c20648eba3856a18a6c4293f9a5d38ab1d76415b4"} Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.753241 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"db411730-378f-4d00-971c-cd18e52fa179","Type":"ContainerStarted","Data":"581b9b5c1c451bf5fbbb46328a3fa4d8372ad9be288d3931c9ccd81f1e18503a"} Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.769750 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.76973623 podStartE2EDuration="2.76973623s" podCreationTimestamp="2026-03-08 05:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:24.767797962 +0000 UTC m=+1451.685446806" watchObservedRunningTime="2026-03-08 05:50:24.76973623 +0000 UTC m=+1451.687385074" Mar 08 05:50:24 crc kubenswrapper[4717]: I0308 05:50:24.795471 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.795450996 podStartE2EDuration="2.795450996s" podCreationTimestamp="2026-03-08 05:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:24.783646974 +0000 UTC m=+1451.701295828" watchObservedRunningTime="2026-03-08 05:50:24.795450996 +0000 UTC m=+1451.713099840" Mar 08 05:50:25 crc kubenswrapper[4717]: I0308 05:50:25.027071 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:50:25 crc kubenswrapper[4717]: I0308 05:50:25.098723 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 05:50:25 crc kubenswrapper[4717]: I0308 05:50:25.098790 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 05:50:25 crc kubenswrapper[4717]: I0308 05:50:25.637520 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 05:50:25 crc kubenswrapper[4717]: I0308 05:50:25.777792 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"42fad031-16bc-4641-9f97-3b89351d0b89","Type":"ContainerStarted","Data":"a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92"} Mar 08 05:50:25 crc kubenswrapper[4717]: I0308 05:50:25.777860 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"42fad031-16bc-4641-9f97-3b89351d0b89","Type":"ContainerStarted","Data":"accbe5aadc8e7f569d48d142f83cad12921b6db818eebd64dc8e31eb0ef57158"} Mar 08 05:50:25 crc kubenswrapper[4717]: I0308 05:50:25.798036 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.798017099 podStartE2EDuration="1.798017099s" podCreationTimestamp="2026-03-08 05:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:25.793658051 +0000 UTC m=+1452.711306935" watchObservedRunningTime="2026-03-08 05:50:25.798017099 +0000 UTC m=+1452.715665943" Mar 08 05:50:25 crc kubenswrapper[4717]: I0308 05:50:25.858754 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1eab09-61ae-43df-9173-7267107e9f24" path="/var/lib/kubelet/pods/4e1eab09-61ae-43df-9173-7267107e9f24/volumes" Mar 08 05:50:26 crc kubenswrapper[4717]: I0308 05:50:26.024805 4717 scope.go:117] "RemoveContainer" containerID="cf3fe648c4bcc07f3b254ce82882c1fbe0e5a130e3f05c082e588fda45b0dcfb" Mar 08 05:50:28 crc kubenswrapper[4717]: I0308 05:50:28.167085 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 05:50:29 crc kubenswrapper[4717]: I0308 05:50:29.561847 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 05:50:29 crc kubenswrapper[4717]: I0308 05:50:29.599036 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/kube-state-metrics-0"] Mar 08 05:50:29 crc kubenswrapper[4717]: I0308 05:50:29.599286 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="52a5a330-a048-48ff-b195-fc897299b500" containerName="kube-state-metrics" containerID="cri-o://8881b8b300a170a14020d36cc4145be080a6558a327f34b0f8333b9bdad7fafd" gracePeriod=30 Mar 08 05:50:29 crc kubenswrapper[4717]: I0308 05:50:29.835096 4717 generic.go:334] "Generic (PLEG): container finished" podID="52a5a330-a048-48ff-b195-fc897299b500" containerID="8881b8b300a170a14020d36cc4145be080a6558a327f34b0f8333b9bdad7fafd" exitCode=2 Mar 08 05:50:29 crc kubenswrapper[4717]: I0308 05:50:29.835156 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"52a5a330-a048-48ff-b195-fc897299b500","Type":"ContainerDied","Data":"8881b8b300a170a14020d36cc4145be080a6558a327f34b0f8333b9bdad7fafd"} Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.099403 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.101947 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.190909 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.342617 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpp5t\" (UniqueName: \"kubernetes.io/projected/52a5a330-a048-48ff-b195-fc897299b500-kube-api-access-tpp5t\") pod \"52a5a330-a048-48ff-b195-fc897299b500\" (UID: \"52a5a330-a048-48ff-b195-fc897299b500\") " Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.351855 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a5a330-a048-48ff-b195-fc897299b500-kube-api-access-tpp5t" (OuterVolumeSpecName: "kube-api-access-tpp5t") pod "52a5a330-a048-48ff-b195-fc897299b500" (UID: "52a5a330-a048-48ff-b195-fc897299b500"). InnerVolumeSpecName "kube-api-access-tpp5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.444901 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpp5t\" (UniqueName: \"kubernetes.io/projected/52a5a330-a048-48ff-b195-fc897299b500-kube-api-access-tpp5t\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.849199 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"52a5a330-a048-48ff-b195-fc897299b500","Type":"ContainerDied","Data":"7d789d5e1db6d4e497c79705b0cd83a1156894935fae821177e1c1ce97cb2870"} Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.849228 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.849281 4717 scope.go:117] "RemoveContainer" containerID="8881b8b300a170a14020d36cc4145be080a6558a327f34b0f8333b9bdad7fafd" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.947592 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.971645 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.986736 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 05:50:30 crc kubenswrapper[4717]: E0308 05:50:30.987227 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a5a330-a048-48ff-b195-fc897299b500" containerName="kube-state-metrics" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.987245 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a5a330-a048-48ff-b195-fc897299b500" containerName="kube-state-metrics" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.987424 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a5a330-a048-48ff-b195-fc897299b500" containerName="kube-state-metrics" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.988142 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.996415 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 08 05:50:30 crc kubenswrapper[4717]: I0308 05:50:30.996800 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.009461 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.110852 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.226:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.110882 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.226:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.158172 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.158529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.158712 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.158781 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmgn\" (UniqueName: \"kubernetes.io/projected/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-kube-api-access-fvmgn\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.260744 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.260808 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.260832 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmgn\" (UniqueName: 
\"kubernetes.io/projected/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-kube-api-access-fvmgn\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.260940 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.265585 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.267300 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.276981 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.287547 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmgn\" (UniqueName: 
\"kubernetes.io/projected/99fa4aa1-7000-4df4-8c35-d9bf87df65f3-kube-api-access-fvmgn\") pod \"kube-state-metrics-0\" (UID: \"99fa4aa1-7000-4df4-8c35-d9bf87df65f3\") " pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.315590 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.672132 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.672711 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="ceilometer-central-agent" containerID="cri-o://d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438" gracePeriod=30 Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.672787 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="sg-core" containerID="cri-o://bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac" gracePeriod=30 Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.672824 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="ceilometer-notification-agent" containerID="cri-o://eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3" gracePeriod=30 Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.672864 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="proxy-httpd" containerID="cri-o://5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835" gracePeriod=30 Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.796354 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52a5a330-a048-48ff-b195-fc897299b500" path="/var/lib/kubelet/pods/52a5a330-a048-48ff-b195-fc897299b500/volumes" Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.814770 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.860842 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"99fa4aa1-7000-4df4-8c35-d9bf87df65f3","Type":"ContainerStarted","Data":"d8eb340d6319507557cfbacf720ccec5ea2f07ca2f671a8958c8bb8bc07c7705"} Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.863283 4717 generic.go:334] "Generic (PLEG): container finished" podID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerID="bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac" exitCode=2 Mar 08 05:50:31 crc kubenswrapper[4717]: I0308 05:50:31.863304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d377841-4cb0-42fb-b8e6-5fca1f0263a4","Type":"ContainerDied","Data":"bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac"} Mar 08 05:50:32 crc kubenswrapper[4717]: I0308 05:50:32.878574 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"99fa4aa1-7000-4df4-8c35-d9bf87df65f3","Type":"ContainerStarted","Data":"bb5ccb64697d4af819180c1b4fb4b51f51b2d04cfb002b60aa99a0e1920a197b"} Mar 08 05:50:32 crc kubenswrapper[4717]: I0308 05:50:32.879034 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 08 05:50:32 crc kubenswrapper[4717]: I0308 05:50:32.883012 4717 generic.go:334] "Generic (PLEG): container finished" podID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerID="5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835" exitCode=0 Mar 08 05:50:32 crc kubenswrapper[4717]: I0308 05:50:32.883051 4717 
generic.go:334] "Generic (PLEG): container finished" podID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerID="d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438" exitCode=0 Mar 08 05:50:32 crc kubenswrapper[4717]: I0308 05:50:32.883064 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d377841-4cb0-42fb-b8e6-5fca1f0263a4","Type":"ContainerDied","Data":"5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835"} Mar 08 05:50:32 crc kubenswrapper[4717]: I0308 05:50:32.883129 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d377841-4cb0-42fb-b8e6-5fca1f0263a4","Type":"ContainerDied","Data":"d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438"} Mar 08 05:50:32 crc kubenswrapper[4717]: I0308 05:50:32.920409 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.551556291 podStartE2EDuration="2.920379371s" podCreationTimestamp="2026-03-08 05:50:30 +0000 UTC" firstStartedPulling="2026-03-08 05:50:31.850305839 +0000 UTC m=+1458.767954673" lastFinishedPulling="2026-03-08 05:50:32.219128899 +0000 UTC m=+1459.136777753" observedRunningTime="2026-03-08 05:50:32.907639656 +0000 UTC m=+1459.825288510" watchObservedRunningTime="2026-03-08 05:50:32.920379371 +0000 UTC m=+1459.838028255" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.150379 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.153923 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.484597 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.611260 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-run-httpd\") pod \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.611420 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-sg-core-conf-yaml\") pod \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.611513 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-scripts\") pod \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.611569 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-log-httpd\") pod \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.611670 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtnrb\" (UniqueName: \"kubernetes.io/projected/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-kube-api-access-qtnrb\") pod \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.611719 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-config-data\") pod \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.611751 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-combined-ca-bundle\") pod \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\" (UID: \"8d377841-4cb0-42fb-b8e6-5fca1f0263a4\") " Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.613306 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8d377841-4cb0-42fb-b8e6-5fca1f0263a4" (UID: "8d377841-4cb0-42fb-b8e6-5fca1f0263a4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.613649 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8d377841-4cb0-42fb-b8e6-5fca1f0263a4" (UID: "8d377841-4cb0-42fb-b8e6-5fca1f0263a4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.617286 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-kube-api-access-qtnrb" (OuterVolumeSpecName: "kube-api-access-qtnrb") pod "8d377841-4cb0-42fb-b8e6-5fca1f0263a4" (UID: "8d377841-4cb0-42fb-b8e6-5fca1f0263a4"). InnerVolumeSpecName "kube-api-access-qtnrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.620776 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-scripts" (OuterVolumeSpecName: "scripts") pod "8d377841-4cb0-42fb-b8e6-5fca1f0263a4" (UID: "8d377841-4cb0-42fb-b8e6-5fca1f0263a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.641755 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8d377841-4cb0-42fb-b8e6-5fca1f0263a4" (UID: "8d377841-4cb0-42fb-b8e6-5fca1f0263a4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.688995 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d377841-4cb0-42fb-b8e6-5fca1f0263a4" (UID: "8d377841-4cb0-42fb-b8e6-5fca1f0263a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.717383 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtnrb\" (UniqueName: \"kubernetes.io/projected/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-kube-api-access-qtnrb\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.717417 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.717429 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.717441 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.717452 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.717463 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.720575 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-config-data" (OuterVolumeSpecName: "config-data") pod "8d377841-4cb0-42fb-b8e6-5fca1f0263a4" (UID: "8d377841-4cb0-42fb-b8e6-5fca1f0263a4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.821864 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d377841-4cb0-42fb-b8e6-5fca1f0263a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.908244 4717 generic.go:334] "Generic (PLEG): container finished" podID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerID="eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3" exitCode=0 Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.908312 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d377841-4cb0-42fb-b8e6-5fca1f0263a4","Type":"ContainerDied","Data":"eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3"} Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.908370 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d377841-4cb0-42fb-b8e6-5fca1f0263a4","Type":"ContainerDied","Data":"fb6f55e48ccb911ed71bbc3714fd88f1486751db07c507332eb7c5c244d307bc"} Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.908393 4717 scope.go:117] "RemoveContainer" containerID="5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.909256 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.940670 4717 scope.go:117] "RemoveContainer" containerID="bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.945865 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.959915 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.966564 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:33 crc kubenswrapper[4717]: E0308 05:50:33.967098 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="ceilometer-central-agent" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.968802 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="ceilometer-central-agent" Mar 08 05:50:33 crc kubenswrapper[4717]: E0308 05:50:33.968935 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="ceilometer-notification-agent" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.968990 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="ceilometer-notification-agent" Mar 08 05:50:33 crc kubenswrapper[4717]: E0308 05:50:33.969054 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="sg-core" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.969108 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="sg-core" Mar 08 05:50:33 crc kubenswrapper[4717]: E0308 05:50:33.969161 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="proxy-httpd" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.969212 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="proxy-httpd" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.969483 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="ceilometer-central-agent" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.969618 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="proxy-httpd" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.969667 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="ceilometer-notification-agent" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.969763 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" containerName="sg-core" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.970243 4717 scope.go:117] "RemoveContainer" containerID="eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.992395 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.996368 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.996540 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.996775 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 05:50:33 crc kubenswrapper[4717]: I0308 05:50:33.998294 4717 scope.go:117] "RemoveContainer" containerID="d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:33.998871 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.029258 4717 scope.go:117] "RemoveContainer" containerID="5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835" Mar 08 05:50:34 crc kubenswrapper[4717]: E0308 05:50:34.030382 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835\": container with ID starting with 5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835 not found: ID does not exist" containerID="5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.030505 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835"} err="failed to get container status \"5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835\": rpc error: code = NotFound desc = could not find container \"5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835\": 
container with ID starting with 5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835 not found: ID does not exist" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.030534 4717 scope.go:117] "RemoveContainer" containerID="bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac" Mar 08 05:50:34 crc kubenswrapper[4717]: E0308 05:50:34.030798 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac\": container with ID starting with bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac not found: ID does not exist" containerID="bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.030822 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac"} err="failed to get container status \"bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac\": rpc error: code = NotFound desc = could not find container \"bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac\": container with ID starting with bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac not found: ID does not exist" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.030836 4717 scope.go:117] "RemoveContainer" containerID="eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3" Mar 08 05:50:34 crc kubenswrapper[4717]: E0308 05:50:34.031126 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3\": container with ID starting with eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3 not found: ID does not exist" 
containerID="eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.031147 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3"} err="failed to get container status \"eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3\": rpc error: code = NotFound desc = could not find container \"eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3\": container with ID starting with eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3 not found: ID does not exist" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.031159 4717 scope.go:117] "RemoveContainer" containerID="d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438" Mar 08 05:50:34 crc kubenswrapper[4717]: E0308 05:50:34.031466 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438\": container with ID starting with d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438 not found: ID does not exist" containerID="d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.031485 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438"} err="failed to get container status \"d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438\": rpc error: code = NotFound desc = could not find container \"d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438\": container with ID starting with d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438 not found: ID does not exist" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.128354 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.128574 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-log-httpd\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.128931 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4xp\" (UniqueName: \"kubernetes.io/projected/cdee03e0-31d8-44a1-86e8-92a5b356370d-kube-api-access-rg4xp\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.129028 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-config-data\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.129564 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.129720 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-run-httpd\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.129776 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.129840 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-scripts\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.232615 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4xp\" (UniqueName: \"kubernetes.io/projected/cdee03e0-31d8-44a1-86e8-92a5b356370d-kube-api-access-rg4xp\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.232823 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-config-data\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.232877 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db411730-378f-4d00-971c-cd18e52fa179" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.228:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.232939 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.232937 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db411730-378f-4d00-971c-cd18e52fa179" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.228:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.233069 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-run-httpd\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.233125 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.233203 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-scripts\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.233383 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.233451 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-log-httpd\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.233564 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-run-httpd\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.234213 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-log-httpd\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.239246 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-scripts\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.239356 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.240165 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-config-data\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.254803 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.255506 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.264870 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg4xp\" (UniqueName: \"kubernetes.io/projected/cdee03e0-31d8-44a1-86e8-92a5b356370d-kube-api-access-rg4xp\") pod \"ceilometer-0\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.310698 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.562023 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.597241 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.819241 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.931164 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdee03e0-31d8-44a1-86e8-92a5b356370d","Type":"ContainerStarted","Data":"f3e992eb85be83371b44dcbfc2f9194b33d4f966c91d1f71c1090f73fdf47076"} Mar 08 05:50:34 crc kubenswrapper[4717]: I0308 05:50:34.976794 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 05:50:35 crc kubenswrapper[4717]: I0308 05:50:35.793083 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d377841-4cb0-42fb-b8e6-5fca1f0263a4" path="/var/lib/kubelet/pods/8d377841-4cb0-42fb-b8e6-5fca1f0263a4/volumes" Mar 08 05:50:35 crc kubenswrapper[4717]: I0308 05:50:35.943453 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdee03e0-31d8-44a1-86e8-92a5b356370d","Type":"ContainerStarted","Data":"2e61ff8627c042942b55f2641333147dc5c55f3b996cf2f9b333525379bce7aa"} Mar 08 05:50:35 crc kubenswrapper[4717]: I0308 05:50:35.943492 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdee03e0-31d8-44a1-86e8-92a5b356370d","Type":"ContainerStarted","Data":"8daeaf0236f22615e0839234098b971ac435807b9f6a5e745b76d0129a9004a7"} Mar 08 05:50:36 crc kubenswrapper[4717]: I0308 05:50:36.971870 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"cdee03e0-31d8-44a1-86e8-92a5b356370d","Type":"ContainerStarted","Data":"65725c57706ec9f467748c5886df75fa5f900bbcb919d4848a0f175d8fb8784f"} Mar 08 05:50:38 crc kubenswrapper[4717]: E0308 05:50:38.140496 4717 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/e5ef4b5b561a98833819cf7d1b0e3c989a8103c9a2d23c7a2fc27da3fdd0b9d0/diff" to get inode usage: stat /var/lib/containers/storage/overlay/e5ef4b5b561a98833819cf7d1b0e3c989a8103c9a2d23c7a2fc27da3fdd0b9d0/diff: no such file or directory, extraDiskErr: Mar 08 05:50:39 crc kubenswrapper[4717]: I0308 05:50:38.999674 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdee03e0-31d8-44a1-86e8-92a5b356370d","Type":"ContainerStarted","Data":"74fc138fd4a14a2d804d81e65589f281b663e7443c21c6da431cd7a72878767c"} Mar 08 05:50:39 crc kubenswrapper[4717]: I0308 05:50:39.000441 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 05:50:39 crc kubenswrapper[4717]: I0308 05:50:39.037178 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.62124911 podStartE2EDuration="6.037159625s" podCreationTimestamp="2026-03-08 05:50:33 +0000 UTC" firstStartedPulling="2026-03-08 05:50:34.827483872 +0000 UTC m=+1461.745132706" lastFinishedPulling="2026-03-08 05:50:38.243394347 +0000 UTC m=+1465.161043221" observedRunningTime="2026-03-08 05:50:39.028194794 +0000 UTC m=+1465.945843678" watchObservedRunningTime="2026-03-08 05:50:39.037159625 +0000 UTC m=+1465.954808469" Mar 08 05:50:39 crc kubenswrapper[4717]: E0308 05:50:39.377238 4717 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/bd09f9478f1e7e7fafa64d1fb3199bd612a42cecd90d9c89b323c44f4ade7e63/diff" to get inode usage: stat 
/var/lib/containers/storage/overlay/bd09f9478f1e7e7fafa64d1fb3199bd612a42cecd90d9c89b323c44f4ade7e63/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_dnsmasq-dns-6fdf58bb7c-v2hj5_41e72e63-490d-46aa-b4ff-68e33f7def1c/dnsmasq-dns/0.log" to get inode usage: stat /var/log/pods/openstack_dnsmasq-dns-6fdf58bb7c-v2hj5_41e72e63-490d-46aa-b4ff-68e33f7def1c/dnsmasq-dns/0.log: no such file or directory Mar 08 05:50:40 crc kubenswrapper[4717]: I0308 05:50:40.111787 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 05:50:40 crc kubenswrapper[4717]: I0308 05:50:40.113376 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 05:50:40 crc kubenswrapper[4717]: I0308 05:50:40.137834 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 05:50:41 crc kubenswrapper[4717]: I0308 05:50:41.034081 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 05:50:41 crc kubenswrapper[4717]: I0308 05:50:41.331432 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 08 05:50:42 crc kubenswrapper[4717]: W0308 05:50:42.576379 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95ee8450_26f4_4689_ad41_09426d65b6ff.slice/crio-492dcd550d336145aab5cf7fa4e69ecc1cade9104906428d8657a5d0d1878d61 WatchSource:0}: Error finding container 492dcd550d336145aab5cf7fa4e69ecc1cade9104906428d8657a5d0d1878d61: Status 404 returned error can't find the container with id 492dcd550d336145aab5cf7fa4e69ecc1cade9104906428d8657a5d0d1878d61 Mar 08 05:50:42 crc kubenswrapper[4717]: W0308 05:50:42.577218 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95ee8450_26f4_4689_ad41_09426d65b6ff.slice/crio-42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef.scope WatchSource:0}: Error finding container 42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef: Status 404 returned error can't find the container with id 42cf4481c4095c1140a90a34c2f8f42eb3e23e9aa956cc1bd5062d27d399fdef Mar 08 05:50:42 crc kubenswrapper[4717]: W0308 05:50:42.577566 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95ee8450_26f4_4689_ad41_09426d65b6ff.slice/crio-a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60.scope WatchSource:0}: Error finding container a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60: Status 404 returned error can't find the container with id a1590443a86ceabf95347b28a76ad5ba0cfe31000b45f06734b786563ada2c60 Mar 08 05:50:42 crc kubenswrapper[4717]: E0308 05:50:42.847234 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a5a330_a048_48ff_b195_fc897299b500.slice/crio-conmon-8881b8b300a170a14020d36cc4145be080a6558a327f34b0f8333b9bdad7fafd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice/crio-5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice/crio-d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda867b873_1e1b_4104_ae6d_34f2d216c3ca.slice/crio-conmon-b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a5a330_a048_48ff_b195_fc897299b500.slice/crio-7d789d5e1db6d4e497c79705b0cd83a1156894935fae821177e1c1ce97cb2870\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice/crio-eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice/crio-fb6f55e48ccb911ed71bbc3714fd88f1486751db07c507332eb7c5c244d307bc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice/crio-conmon-bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice/crio-conmon-eab6a293e8018e8107a9b8ad162993e148c0414e1c8415e9cd20a6df4a852ce3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice/crio-conmon-d4292864a1575ea61893601ad8db181f907145a307cc9082c0304a84d4b40438.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice/crio-bcdd75819a759ec676f0968a354657d30e279cdd86620c4d7403eb6ef8fb57ac.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a5a330_a048_48ff_b195_fc897299b500.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice/crio-conmon-5e88f8baf0eb4d4567f3eb4a722ed441a937ac095aef6aea50f412a8a0a6e835.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52a5a330_a048_48ff_b195_fc897299b500.slice/crio-8881b8b300a170a14020d36cc4145be080a6558a327f34b0f8333b9bdad7fafd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d377841_4cb0_42fb_b8e6_5fca1f0263a4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda867b873_1e1b_4104_ae6d_34f2d216c3ca.slice/crio-b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce.scope\": RecentStats: unable to find data in memory cache]" Mar 08 05:50:42 crc kubenswrapper[4717]: I0308 05:50:42.992895 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.048942 4717 generic.go:334] "Generic (PLEG): container finished" podID="a867b873-1e1b-4104-ae6d-34f2d216c3ca" containerID="b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce" exitCode=137 Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.048981 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a867b873-1e1b-4104-ae6d-34f2d216c3ca","Type":"ContainerDied","Data":"b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce"} Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.049033 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a867b873-1e1b-4104-ae6d-34f2d216c3ca","Type":"ContainerDied","Data":"c5a23b6ca3aad1d6d28665b167ba150ac248ed895d60add99e15fd1623c098f8"} Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.049055 4717 scope.go:117] "RemoveContainer" containerID="b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.049006 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.050394 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-config-data\") pod \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.050502 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-combined-ca-bundle\") pod \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.050575 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5r7t\" (UniqueName: \"kubernetes.io/projected/a867b873-1e1b-4104-ae6d-34f2d216c3ca-kube-api-access-q5r7t\") pod \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\" (UID: \"a867b873-1e1b-4104-ae6d-34f2d216c3ca\") " Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.067027 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a867b873-1e1b-4104-ae6d-34f2d216c3ca-kube-api-access-q5r7t" (OuterVolumeSpecName: "kube-api-access-q5r7t") pod "a867b873-1e1b-4104-ae6d-34f2d216c3ca" (UID: "a867b873-1e1b-4104-ae6d-34f2d216c3ca"). InnerVolumeSpecName "kube-api-access-q5r7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.077847 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-config-data" (OuterVolumeSpecName: "config-data") pod "a867b873-1e1b-4104-ae6d-34f2d216c3ca" (UID: "a867b873-1e1b-4104-ae6d-34f2d216c3ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.108319 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a867b873-1e1b-4104-ae6d-34f2d216c3ca" (UID: "a867b873-1e1b-4104-ae6d-34f2d216c3ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.152843 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.152878 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5r7t\" (UniqueName: \"kubernetes.io/projected/a867b873-1e1b-4104-ae6d-34f2d216c3ca-kube-api-access-q5r7t\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.152894 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a867b873-1e1b-4104-ae6d-34f2d216c3ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.171671 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.172127 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.173489 4717 scope.go:117] "RemoveContainer" containerID="b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.177104 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 05:50:43 
crc kubenswrapper[4717]: E0308 05:50:43.180091 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce\": container with ID starting with b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce not found: ID does not exist" containerID="b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.180126 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce"} err="failed to get container status \"b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce\": rpc error: code = NotFound desc = could not find container \"b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce\": container with ID starting with b317b77557df12420e9a1500160b243ceb3b8430f8968333c8dd7628bc58b7ce not found: ID does not exist" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.201576 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.381763 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.403568 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.414232 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 05:50:43 crc kubenswrapper[4717]: E0308 05:50:43.414703 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a867b873-1e1b-4104-ae6d-34f2d216c3ca" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.414717 4717 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="a867b873-1e1b-4104-ae6d-34f2d216c3ca" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.414918 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a867b873-1e1b-4104-ae6d-34f2d216c3ca" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.415590 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.422925 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.446368 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.446544 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.447734 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.460428 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.460492 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.460517 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.460909 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.460944 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfd9z\" (UniqueName: \"kubernetes.io/projected/bf251d2f-577d-4de2-ac4b-f51dc79add8d-kube-api-access-hfd9z\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.561579 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.561643 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.561663 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.561761 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.561776 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfd9z\" (UniqueName: \"kubernetes.io/projected/bf251d2f-577d-4de2-ac4b-f51dc79add8d-kube-api-access-hfd9z\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.565896 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.566557 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc 
kubenswrapper[4717]: I0308 05:50:43.567399 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.567631 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf251d2f-577d-4de2-ac4b-f51dc79add8d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.586005 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfd9z\" (UniqueName: \"kubernetes.io/projected/bf251d2f-577d-4de2-ac4b-f51dc79add8d-kube-api-access-hfd9z\") pod \"nova-cell1-novncproxy-0\" (UID: \"bf251d2f-577d-4de2-ac4b-f51dc79add8d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.757331 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:43 crc kubenswrapper[4717]: I0308 05:50:43.802505 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a867b873-1e1b-4104-ae6d-34f2d216c3ca" path="/var/lib/kubelet/pods/a867b873-1e1b-4104-ae6d-34f2d216c3ca/volumes" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.058388 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.064711 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.239383 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59b79f7d5c-dwvdr"] Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.243141 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.254063 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59b79f7d5c-dwvdr"] Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.275449 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 05:50:44 crc kubenswrapper[4717]: W0308 05:50:44.278486 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf251d2f_577d_4de2_ac4b_f51dc79add8d.slice/crio-e07ec4b5851a12ea0738f03f9ba5c5856d82782d25aa14b3ba4523a77c3c4f00 WatchSource:0}: Error finding container e07ec4b5851a12ea0738f03f9ba5c5856d82782d25aa14b3ba4523a77c3c4f00: Status 404 returned error can't find the container with id e07ec4b5851a12ea0738f03f9ba5c5856d82782d25aa14b3ba4523a77c3c4f00 Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.377073 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-svc\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.377622 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-swift-storage-0\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.377667 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-config\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.377711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-nb\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.377763 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-sb\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.377817 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqkn5\" (UniqueName: \"kubernetes.io/projected/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-kube-api-access-zqkn5\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.479557 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-svc\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.479741 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-swift-storage-0\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.479813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-config\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.479878 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-nb\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.479991 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-sb\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.480019 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqkn5\" (UniqueName: \"kubernetes.io/projected/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-kube-api-access-zqkn5\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.480402 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-svc\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.480734 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-nb\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.480760 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-config\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.481051 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-sb\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.481311 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-swift-storage-0\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.497273 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqkn5\" (UniqueName: \"kubernetes.io/projected/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-kube-api-access-zqkn5\") pod \"dnsmasq-dns-59b79f7d5c-dwvdr\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:44 crc kubenswrapper[4717]: I0308 05:50:44.559103 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:45 crc kubenswrapper[4717]: I0308 05:50:45.066421 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bf251d2f-577d-4de2-ac4b-f51dc79add8d","Type":"ContainerStarted","Data":"7838c2e156085f58828a66cd19142dd51d1c4f415c0759ead58b1f4dc932958c"} Mar 08 05:50:45 crc kubenswrapper[4717]: I0308 05:50:45.066718 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bf251d2f-577d-4de2-ac4b-f51dc79add8d","Type":"ContainerStarted","Data":"e07ec4b5851a12ea0738f03f9ba5c5856d82782d25aa14b3ba4523a77c3c4f00"} Mar 08 05:50:45 crc kubenswrapper[4717]: I0308 05:50:45.086437 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.086418841 podStartE2EDuration="2.086418841s" podCreationTimestamp="2026-03-08 05:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:45.081668063 +0000 UTC m=+1471.999316907" watchObservedRunningTime="2026-03-08 05:50:45.086418841 +0000 UTC m=+1472.004067685" Mar 08 05:50:45 crc kubenswrapper[4717]: W0308 05:50:45.133376 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b6e938f_e9f5_42cb_b0b3_3b3b35ae895d.slice/crio-90f1d2a986b2ec1291df5e4452c6c0a31add797e977fa401a4f20d88bcab3340 WatchSource:0}: Error finding container 90f1d2a986b2ec1291df5e4452c6c0a31add797e977fa401a4f20d88bcab3340: Status 404 returned error can't find the container with id 90f1d2a986b2ec1291df5e4452c6c0a31add797e977fa401a4f20d88bcab3340 Mar 08 05:50:45 crc kubenswrapper[4717]: I0308 05:50:45.153534 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59b79f7d5c-dwvdr"] Mar 08 05:50:46 crc 
kubenswrapper[4717]: I0308 05:50:46.075026 4717 generic.go:334] "Generic (PLEG): container finished" podID="9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" containerID="b3b6705875ce1f1de5d557a9e807f71e9b7124a221e9c25638f9a843dde230f7" exitCode=0 Mar 08 05:50:46 crc kubenswrapper[4717]: I0308 05:50:46.075878 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" event={"ID":"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d","Type":"ContainerDied","Data":"b3b6705875ce1f1de5d557a9e807f71e9b7124a221e9c25638f9a843dde230f7"} Mar 08 05:50:46 crc kubenswrapper[4717]: I0308 05:50:46.075942 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" event={"ID":"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d","Type":"ContainerStarted","Data":"90f1d2a986b2ec1291df5e4452c6c0a31add797e977fa401a4f20d88bcab3340"} Mar 08 05:50:46 crc kubenswrapper[4717]: I0308 05:50:46.812221 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:47 crc kubenswrapper[4717]: I0308 05:50:47.085665 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db411730-378f-4d00-971c-cd18e52fa179" containerName="nova-api-log" containerID="cri-o://7637a5cfab9de0e7ef94636c20648eba3856a18a6c4293f9a5d38ab1d76415b4" gracePeriod=30 Mar 08 05:50:47 crc kubenswrapper[4717]: I0308 05:50:47.086484 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db411730-378f-4d00-971c-cd18e52fa179" containerName="nova-api-api" containerID="cri-o://b45b86c0fa28d23218a2cf70be91d8d9208e1bce0d0cec38d3f7b2c1b1c8c133" gracePeriod=30 Mar 08 05:50:47 crc kubenswrapper[4717]: I0308 05:50:47.086501 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" 
event={"ID":"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d","Type":"ContainerStarted","Data":"290f88ff410ade0662248512c1eeba995eac6f021fc98db7ac1ff100faebfd4d"} Mar 08 05:50:47 crc kubenswrapper[4717]: I0308 05:50:47.086820 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:47 crc kubenswrapper[4717]: I0308 05:50:47.110445 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" podStartSLOduration=3.110429093 podStartE2EDuration="3.110429093s" podCreationTimestamp="2026-03-08 05:50:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:47.109551781 +0000 UTC m=+1474.027200625" watchObservedRunningTime="2026-03-08 05:50:47.110429093 +0000 UTC m=+1474.028077937" Mar 08 05:50:47 crc kubenswrapper[4717]: I0308 05:50:47.381880 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:47 crc kubenswrapper[4717]: I0308 05:50:47.388071 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="ceilometer-central-agent" containerID="cri-o://8daeaf0236f22615e0839234098b971ac435807b9f6a5e745b76d0129a9004a7" gracePeriod=30 Mar 08 05:50:47 crc kubenswrapper[4717]: I0308 05:50:47.388574 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="proxy-httpd" containerID="cri-o://74fc138fd4a14a2d804d81e65589f281b663e7443c21c6da431cd7a72878767c" gracePeriod=30 Mar 08 05:50:47 crc kubenswrapper[4717]: I0308 05:50:47.388717 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="sg-core" 
containerID="cri-o://65725c57706ec9f467748c5886df75fa5f900bbcb919d4848a0f175d8fb8784f" gracePeriod=30 Mar 08 05:50:47 crc kubenswrapper[4717]: I0308 05:50:47.389098 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="ceilometer-notification-agent" containerID="cri-o://2e61ff8627c042942b55f2641333147dc5c55f3b996cf2f9b333525379bce7aa" gracePeriod=30 Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.097633 4717 generic.go:334] "Generic (PLEG): container finished" podID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerID="74fc138fd4a14a2d804d81e65589f281b663e7443c21c6da431cd7a72878767c" exitCode=0 Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.097662 4717 generic.go:334] "Generic (PLEG): container finished" podID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerID="65725c57706ec9f467748c5886df75fa5f900bbcb919d4848a0f175d8fb8784f" exitCode=2 Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.097669 4717 generic.go:334] "Generic (PLEG): container finished" podID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerID="8daeaf0236f22615e0839234098b971ac435807b9f6a5e745b76d0129a9004a7" exitCode=0 Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.097730 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdee03e0-31d8-44a1-86e8-92a5b356370d","Type":"ContainerDied","Data":"74fc138fd4a14a2d804d81e65589f281b663e7443c21c6da431cd7a72878767c"} Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.097758 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdee03e0-31d8-44a1-86e8-92a5b356370d","Type":"ContainerDied","Data":"65725c57706ec9f467748c5886df75fa5f900bbcb919d4848a0f175d8fb8784f"} Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.097769 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cdee03e0-31d8-44a1-86e8-92a5b356370d","Type":"ContainerDied","Data":"8daeaf0236f22615e0839234098b971ac435807b9f6a5e745b76d0129a9004a7"} Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.100509 4717 generic.go:334] "Generic (PLEG): container finished" podID="db411730-378f-4d00-971c-cd18e52fa179" containerID="b45b86c0fa28d23218a2cf70be91d8d9208e1bce0d0cec38d3f7b2c1b1c8c133" exitCode=0 Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.100562 4717 generic.go:334] "Generic (PLEG): container finished" podID="db411730-378f-4d00-971c-cd18e52fa179" containerID="7637a5cfab9de0e7ef94636c20648eba3856a18a6c4293f9a5d38ab1d76415b4" exitCode=143 Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.100589 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db411730-378f-4d00-971c-cd18e52fa179","Type":"ContainerDied","Data":"b45b86c0fa28d23218a2cf70be91d8d9208e1bce0d0cec38d3f7b2c1b1c8c133"} Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.100651 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db411730-378f-4d00-971c-cd18e52fa179","Type":"ContainerDied","Data":"7637a5cfab9de0e7ef94636c20648eba3856a18a6c4293f9a5d38ab1d76415b4"} Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.466511 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.564777 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-config-data\") pod \"db411730-378f-4d00-971c-cd18e52fa179\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.564878 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-combined-ca-bundle\") pod \"db411730-378f-4d00-971c-cd18e52fa179\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.564928 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db411730-378f-4d00-971c-cd18e52fa179-logs\") pod \"db411730-378f-4d00-971c-cd18e52fa179\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.565087 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmngp\" (UniqueName: \"kubernetes.io/projected/db411730-378f-4d00-971c-cd18e52fa179-kube-api-access-zmngp\") pod \"db411730-378f-4d00-971c-cd18e52fa179\" (UID: \"db411730-378f-4d00-971c-cd18e52fa179\") " Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.565497 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db411730-378f-4d00-971c-cd18e52fa179-logs" (OuterVolumeSpecName: "logs") pod "db411730-378f-4d00-971c-cd18e52fa179" (UID: "db411730-378f-4d00-971c-cd18e52fa179"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.592697 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db411730-378f-4d00-971c-cd18e52fa179-kube-api-access-zmngp" (OuterVolumeSpecName: "kube-api-access-zmngp") pod "db411730-378f-4d00-971c-cd18e52fa179" (UID: "db411730-378f-4d00-971c-cd18e52fa179"). InnerVolumeSpecName "kube-api-access-zmngp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.599716 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-config-data" (OuterVolumeSpecName: "config-data") pod "db411730-378f-4d00-971c-cd18e52fa179" (UID: "db411730-378f-4d00-971c-cd18e52fa179"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.600249 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db411730-378f-4d00-971c-cd18e52fa179" (UID: "db411730-378f-4d00-971c-cd18e52fa179"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.667288 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmngp\" (UniqueName: \"kubernetes.io/projected/db411730-378f-4d00-971c-cd18e52fa179-kube-api-access-zmngp\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.667335 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.667344 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db411730-378f-4d00-971c-cd18e52fa179-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.667354 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db411730-378f-4d00-971c-cd18e52fa179-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:48 crc kubenswrapper[4717]: I0308 05:50:48.757669 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.115242 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.115235 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db411730-378f-4d00-971c-cd18e52fa179","Type":"ContainerDied","Data":"581b9b5c1c451bf5fbbb46328a3fa4d8372ad9be288d3931c9ccd81f1e18503a"} Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.115579 4717 scope.go:117] "RemoveContainer" containerID="b45b86c0fa28d23218a2cf70be91d8d9208e1bce0d0cec38d3f7b2c1b1c8c133" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.122826 4717 generic.go:334] "Generic (PLEG): container finished" podID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerID="2e61ff8627c042942b55f2641333147dc5c55f3b996cf2f9b333525379bce7aa" exitCode=0 Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.122875 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdee03e0-31d8-44a1-86e8-92a5b356370d","Type":"ContainerDied","Data":"2e61ff8627c042942b55f2641333147dc5c55f3b996cf2f9b333525379bce7aa"} Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.145559 4717 scope.go:117] "RemoveContainer" containerID="7637a5cfab9de0e7ef94636c20648eba3856a18a6c4293f9a5d38ab1d76415b4" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.147598 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.160215 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.182568 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:49 crc kubenswrapper[4717]: E0308 05:50:49.182993 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db411730-378f-4d00-971c-cd18e52fa179" containerName="nova-api-log" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.183007 4717 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="db411730-378f-4d00-971c-cd18e52fa179" containerName="nova-api-log" Mar 08 05:50:49 crc kubenswrapper[4717]: E0308 05:50:49.183030 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db411730-378f-4d00-971c-cd18e52fa179" containerName="nova-api-api" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.183037 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="db411730-378f-4d00-971c-cd18e52fa179" containerName="nova-api-api" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.183231 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="db411730-378f-4d00-971c-cd18e52fa179" containerName="nova-api-log" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.183246 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="db411730-378f-4d00-971c-cd18e52fa179" containerName="nova-api-api" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.184275 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.186202 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.186427 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.186701 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.191127 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.238292 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.279652 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-config-data\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.279711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.279736 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.279777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4bvc\" (UniqueName: \"kubernetes.io/projected/0e283047-d69f-4aa8-9ad0-cd8309013594-kube-api-access-z4bvc\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.279841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e283047-d69f-4aa8-9ad0-cd8309013594-logs\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.280243 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.381791 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-config-data\") pod \"cdee03e0-31d8-44a1-86e8-92a5b356370d\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.381948 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-ceilometer-tls-certs\") pod \"cdee03e0-31d8-44a1-86e8-92a5b356370d\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.381981 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-log-httpd\") pod \"cdee03e0-31d8-44a1-86e8-92a5b356370d\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382003 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg4xp\" (UniqueName: \"kubernetes.io/projected/cdee03e0-31d8-44a1-86e8-92a5b356370d-kube-api-access-rg4xp\") pod \"cdee03e0-31d8-44a1-86e8-92a5b356370d\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382029 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-sg-core-conf-yaml\") pod 
\"cdee03e0-31d8-44a1-86e8-92a5b356370d\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382055 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-combined-ca-bundle\") pod \"cdee03e0-31d8-44a1-86e8-92a5b356370d\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382181 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-run-httpd\") pod \"cdee03e0-31d8-44a1-86e8-92a5b356370d\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382278 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-scripts\") pod \"cdee03e0-31d8-44a1-86e8-92a5b356370d\" (UID: \"cdee03e0-31d8-44a1-86e8-92a5b356370d\") " Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382556 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-config-data\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382603 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382632 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382668 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4bvc\" (UniqueName: \"kubernetes.io/projected/0e283047-d69f-4aa8-9ad0-cd8309013594-kube-api-access-z4bvc\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382751 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e283047-d69f-4aa8-9ad0-cd8309013594-logs\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382841 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.382898 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cdee03e0-31d8-44a1-86e8-92a5b356370d" (UID: "cdee03e0-31d8-44a1-86e8-92a5b356370d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.386047 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cdee03e0-31d8-44a1-86e8-92a5b356370d" (UID: "cdee03e0-31d8-44a1-86e8-92a5b356370d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.386551 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e283047-d69f-4aa8-9ad0-cd8309013594-logs\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.387395 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-config-data\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.390197 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdee03e0-31d8-44a1-86e8-92a5b356370d-kube-api-access-rg4xp" (OuterVolumeSpecName: "kube-api-access-rg4xp") pod "cdee03e0-31d8-44a1-86e8-92a5b356370d" (UID: "cdee03e0-31d8-44a1-86e8-92a5b356370d"). InnerVolumeSpecName "kube-api-access-rg4xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.390558 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-scripts" (OuterVolumeSpecName: "scripts") pod "cdee03e0-31d8-44a1-86e8-92a5b356370d" (UID: "cdee03e0-31d8-44a1-86e8-92a5b356370d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.390571 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.392804 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.399771 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.403888 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4bvc\" (UniqueName: \"kubernetes.io/projected/0e283047-d69f-4aa8-9ad0-cd8309013594-kube-api-access-z4bvc\") pod \"nova-api-0\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.421923 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cdee03e0-31d8-44a1-86e8-92a5b356370d" (UID: "cdee03e0-31d8-44a1-86e8-92a5b356370d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.466477 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cdee03e0-31d8-44a1-86e8-92a5b356370d" (UID: "cdee03e0-31d8-44a1-86e8-92a5b356370d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.469375 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdee03e0-31d8-44a1-86e8-92a5b356370d" (UID: "cdee03e0-31d8-44a1-86e8-92a5b356370d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.485095 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.485115 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.485125 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg4xp\" (UniqueName: \"kubernetes.io/projected/cdee03e0-31d8-44a1-86e8-92a5b356370d-kube-api-access-rg4xp\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.485133 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.485142 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.485149 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdee03e0-31d8-44a1-86e8-92a5b356370d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.485156 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.499909 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-config-data" (OuterVolumeSpecName: "config-data") pod "cdee03e0-31d8-44a1-86e8-92a5b356370d" (UID: "cdee03e0-31d8-44a1-86e8-92a5b356370d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.530984 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.587721 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdee03e0-31d8-44a1-86e8-92a5b356370d-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:49 crc kubenswrapper[4717]: I0308 05:50:49.798454 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db411730-378f-4d00-971c-cd18e52fa179" path="/var/lib/kubelet/pods/db411730-378f-4d00-971c-cd18e52fa179/volumes" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.019332 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.146466 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdee03e0-31d8-44a1-86e8-92a5b356370d","Type":"ContainerDied","Data":"f3e992eb85be83371b44dcbfc2f9194b33d4f966c91d1f71c1090f73fdf47076"} Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.146508 4717 scope.go:117] "RemoveContainer" containerID="74fc138fd4a14a2d804d81e65589f281b663e7443c21c6da431cd7a72878767c" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.146612 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.153010 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e283047-d69f-4aa8-9ad0-cd8309013594","Type":"ContainerStarted","Data":"9d58a7d4d0ca1bf15b57d88ce72da85be4568c303195e617f40f23def4ae53be"} Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.170122 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.180351 4717 scope.go:117] "RemoveContainer" containerID="65725c57706ec9f467748c5886df75fa5f900bbcb919d4848a0f175d8fb8784f" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.193437 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.214803 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:50 crc kubenswrapper[4717]: E0308 05:50:50.215242 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="ceilometer-notification-agent" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.215256 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="ceilometer-notification-agent" Mar 08 05:50:50 crc kubenswrapper[4717]: E0308 05:50:50.215277 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="ceilometer-central-agent" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.215283 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="ceilometer-central-agent" Mar 08 05:50:50 crc kubenswrapper[4717]: E0308 05:50:50.215296 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" 
containerName="proxy-httpd" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.215303 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="proxy-httpd" Mar 08 05:50:50 crc kubenswrapper[4717]: E0308 05:50:50.215320 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="sg-core" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.215325 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="sg-core" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.215484 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="proxy-httpd" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.215492 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="sg-core" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.215500 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="ceilometer-central-agent" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.215511 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" containerName="ceilometer-notification-agent" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.218289 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.224033 4717 scope.go:117] "RemoveContainer" containerID="2e61ff8627c042942b55f2641333147dc5c55f3b996cf2f9b333525379bce7aa" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.224182 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.237730 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.245043 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.256405 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.290963 4717 scope.go:117] "RemoveContainer" containerID="8daeaf0236f22615e0839234098b971ac435807b9f6a5e745b76d0129a9004a7" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.415977 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-config-data\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.416268 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-scripts\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.416289 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.416436 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.416529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34d60e99-8898-4576-b35a-8323db25511c-log-httpd\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.416765 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.417017 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b5lx\" (UniqueName: \"kubernetes.io/projected/34d60e99-8898-4576-b35a-8323db25511c-kube-api-access-9b5lx\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.417084 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34d60e99-8898-4576-b35a-8323db25511c-run-httpd\") pod \"ceilometer-0\" (UID: 
\"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.519301 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b5lx\" (UniqueName: \"kubernetes.io/projected/34d60e99-8898-4576-b35a-8323db25511c-kube-api-access-9b5lx\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.519380 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34d60e99-8898-4576-b35a-8323db25511c-run-httpd\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.519416 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-config-data\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.519472 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-scripts\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.519496 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.519538 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.519579 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34d60e99-8898-4576-b35a-8323db25511c-log-httpd\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.519643 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.519977 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34d60e99-8898-4576-b35a-8323db25511c-run-httpd\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.520229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34d60e99-8898-4576-b35a-8323db25511c-log-httpd\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.524502 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc 
kubenswrapper[4717]: I0308 05:50:50.527417 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.527854 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.530577 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-config-data\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.537423 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b5lx\" (UniqueName: \"kubernetes.io/projected/34d60e99-8898-4576-b35a-8323db25511c-kube-api-access-9b5lx\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.539096 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d60e99-8898-4576-b35a-8323db25511c-scripts\") pod \"ceilometer-0\" (UID: \"34d60e99-8898-4576-b35a-8323db25511c\") " pod="openstack/ceilometer-0" Mar 08 05:50:50 crc kubenswrapper[4717]: I0308 05:50:50.552497 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 05:50:51 crc kubenswrapper[4717]: I0308 05:50:51.008848 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 05:50:51 crc kubenswrapper[4717]: W0308 05:50:51.014810 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34d60e99_8898_4576_b35a_8323db25511c.slice/crio-731f76d883439ad0bceb396b77329fb02d2b168022b71d246aa4db9d7127a1cb WatchSource:0}: Error finding container 731f76d883439ad0bceb396b77329fb02d2b168022b71d246aa4db9d7127a1cb: Status 404 returned error can't find the container with id 731f76d883439ad0bceb396b77329fb02d2b168022b71d246aa4db9d7127a1cb Mar 08 05:50:51 crc kubenswrapper[4717]: I0308 05:50:51.165390 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e283047-d69f-4aa8-9ad0-cd8309013594","Type":"ContainerStarted","Data":"668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6"} Mar 08 05:50:51 crc kubenswrapper[4717]: I0308 05:50:51.165440 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e283047-d69f-4aa8-9ad0-cd8309013594","Type":"ContainerStarted","Data":"013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a"} Mar 08 05:50:51 crc kubenswrapper[4717]: I0308 05:50:51.167160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34d60e99-8898-4576-b35a-8323db25511c","Type":"ContainerStarted","Data":"731f76d883439ad0bceb396b77329fb02d2b168022b71d246aa4db9d7127a1cb"} Mar 08 05:50:51 crc kubenswrapper[4717]: I0308 05:50:51.201929 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.201906443 podStartE2EDuration="2.201906443s" podCreationTimestamp="2026-03-08 05:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:51.197801841 +0000 UTC m=+1478.115450705" watchObservedRunningTime="2026-03-08 05:50:51.201906443 +0000 UTC m=+1478.119555297" Mar 08 05:50:51 crc kubenswrapper[4717]: I0308 05:50:51.800003 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdee03e0-31d8-44a1-86e8-92a5b356370d" path="/var/lib/kubelet/pods/cdee03e0-31d8-44a1-86e8-92a5b356370d/volumes" Mar 08 05:50:52 crc kubenswrapper[4717]: I0308 05:50:52.186277 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34d60e99-8898-4576-b35a-8323db25511c","Type":"ContainerStarted","Data":"4f6a9ed48431657e5b0ce95b4f1649809e2095bb37cd1907b60fd8bf6160ea4f"} Mar 08 05:50:52 crc kubenswrapper[4717]: I0308 05:50:52.186343 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34d60e99-8898-4576-b35a-8323db25511c","Type":"ContainerStarted","Data":"ad4efc51ec05abcca3a6f80f21efbccdfbd5eb36b598a15971edc5752486384a"} Mar 08 05:50:53 crc kubenswrapper[4717]: I0308 05:50:53.196971 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34d60e99-8898-4576-b35a-8323db25511c","Type":"ContainerStarted","Data":"6ea53ef378ad9fd518d6c8e8a92a69fe726d409ab7b6a7e2885b98a1f4f5bed0"} Mar 08 05:50:53 crc kubenswrapper[4717]: I0308 05:50:53.758643 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:53 crc kubenswrapper[4717]: I0308 05:50:53.834036 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.228728 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.411854 4717 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-cell-mapping-hfkcg"] Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.413365 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.415859 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.416010 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.428863 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hfkcg"] Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.518647 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh87d\" (UniqueName: \"kubernetes.io/projected/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-kube-api-access-nh87d\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.518865 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-scripts\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.518970 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 
05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.519028 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-config-data\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.559825 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.622935 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-config-data\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.623081 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh87d\" (UniqueName: \"kubernetes.io/projected/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-kube-api-access-nh87d\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.623160 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-scripts\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.623226 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-combined-ca-bundle\") 
pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.629019 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-scripts\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.631249 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-575bd44df9-5d8wr"] Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.631583 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" podUID="0a2f4071-99c2-4755-af9b-ba683a154d22" containerName="dnsmasq-dns" containerID="cri-o://5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab" gracePeriod=10 Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.633275 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-config-data\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.649301 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh87d\" (UniqueName: \"kubernetes.io/projected/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-kube-api-access-nh87d\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.658271 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hfkcg\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:54 crc kubenswrapper[4717]: I0308 05:50:54.737326 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.131240 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.215402 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34d60e99-8898-4576-b35a-8323db25511c","Type":"ContainerStarted","Data":"24bb37864cf9ef8f0ed3e9ea9a785da2c3538df6baff26bd8506ec30f98737e5"} Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.216976 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.220705 4717 generic.go:334] "Generic (PLEG): container finished" podID="0a2f4071-99c2-4755-af9b-ba683a154d22" containerID="5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab" exitCode=0 Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.220978 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.221354 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" event={"ID":"0a2f4071-99c2-4755-af9b-ba683a154d22","Type":"ContainerDied","Data":"5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab"} Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.221438 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-575bd44df9-5d8wr" event={"ID":"0a2f4071-99c2-4755-af9b-ba683a154d22","Type":"ContainerDied","Data":"1ce3fdfc41d8f0eaae914cc19d1eb012b46ab97fae33602352d9ff76026eca5b"} Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.221519 4717 scope.go:117] "RemoveContainer" containerID="5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.233987 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-svc\") pod \"0a2f4071-99c2-4755-af9b-ba683a154d22\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.234041 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-nb\") pod \"0a2f4071-99c2-4755-af9b-ba683a154d22\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.234068 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-config\") pod \"0a2f4071-99c2-4755-af9b-ba683a154d22\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.234116 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-sb\") pod \"0a2f4071-99c2-4755-af9b-ba683a154d22\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.234132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qchj7\" (UniqueName: \"kubernetes.io/projected/0a2f4071-99c2-4755-af9b-ba683a154d22-kube-api-access-qchj7\") pod \"0a2f4071-99c2-4755-af9b-ba683a154d22\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.234211 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-swift-storage-0\") pod \"0a2f4071-99c2-4755-af9b-ba683a154d22\" (UID: \"0a2f4071-99c2-4755-af9b-ba683a154d22\") " Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.246768 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.087903753 podStartE2EDuration="5.24674824s" podCreationTimestamp="2026-03-08 05:50:50 +0000 UTC" firstStartedPulling="2026-03-08 05:50:51.01659624 +0000 UTC m=+1477.934245074" lastFinishedPulling="2026-03-08 05:50:54.175440717 +0000 UTC m=+1481.093089561" observedRunningTime="2026-03-08 05:50:55.241340376 +0000 UTC m=+1482.158989220" watchObservedRunningTime="2026-03-08 05:50:55.24674824 +0000 UTC m=+1482.164397084" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.278985 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2f4071-99c2-4755-af9b-ba683a154d22-kube-api-access-qchj7" (OuterVolumeSpecName: "kube-api-access-qchj7") pod "0a2f4071-99c2-4755-af9b-ba683a154d22" (UID: "0a2f4071-99c2-4755-af9b-ba683a154d22"). 
InnerVolumeSpecName "kube-api-access-qchj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.292576 4717 scope.go:117] "RemoveContainer" containerID="1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.314783 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a2f4071-99c2-4755-af9b-ba683a154d22" (UID: "0a2f4071-99c2-4755-af9b-ba683a154d22"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.315999 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a2f4071-99c2-4755-af9b-ba683a154d22" (UID: "0a2f4071-99c2-4755-af9b-ba683a154d22"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.334181 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-config" (OuterVolumeSpecName: "config") pod "0a2f4071-99c2-4755-af9b-ba683a154d22" (UID: "0a2f4071-99c2-4755-af9b-ba683a154d22"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.337436 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.340442 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.341354 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.341439 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qchj7\" (UniqueName: \"kubernetes.io/projected/0a2f4071-99c2-4755-af9b-ba683a154d22-kube-api-access-qchj7\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.337428 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a2f4071-99c2-4755-af9b-ba683a154d22" (UID: "0a2f4071-99c2-4755-af9b-ba683a154d22"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.342177 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hfkcg"] Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.344637 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0a2f4071-99c2-4755-af9b-ba683a154d22" (UID: "0a2f4071-99c2-4755-af9b-ba683a154d22"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:50:55 crc kubenswrapper[4717]: W0308 05:50:55.346826 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d5a9dec_59bb_4422_b37d_f0fe159f82d4.slice/crio-5abdca67a0c2fbe27ae94a7d14f3c8cf656927e7bbe76d096759c25ad2e0f0dc WatchSource:0}: Error finding container 5abdca67a0c2fbe27ae94a7d14f3c8cf656927e7bbe76d096759c25ad2e0f0dc: Status 404 returned error can't find the container with id 5abdca67a0c2fbe27ae94a7d14f3c8cf656927e7bbe76d096759c25ad2e0f0dc Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.442823 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.442854 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a2f4071-99c2-4755-af9b-ba683a154d22-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.464506 4717 scope.go:117] "RemoveContainer" containerID="5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab" Mar 08 05:50:55 crc kubenswrapper[4717]: E0308 
05:50:55.464949 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab\": container with ID starting with 5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab not found: ID does not exist" containerID="5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.464980 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab"} err="failed to get container status \"5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab\": rpc error: code = NotFound desc = could not find container \"5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab\": container with ID starting with 5aa9bf95d783f590c64147f08ea97923b73e5556bc058bf05ac7210ae9d116ab not found: ID does not exist" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.465003 4717 scope.go:117] "RemoveContainer" containerID="1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5" Mar 08 05:50:55 crc kubenswrapper[4717]: E0308 05:50:55.465342 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5\": container with ID starting with 1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5 not found: ID does not exist" containerID="1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.465368 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5"} err="failed to get container status \"1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5\": rpc 
error: code = NotFound desc = could not find container \"1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5\": container with ID starting with 1ea0d42ab68328a718c4a8b65aec01047806ee2f3e6c5720e334433c04e82cc5 not found: ID does not exist" Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.548279 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-575bd44df9-5d8wr"] Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.556955 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-575bd44df9-5d8wr"] Mar 08 05:50:55 crc kubenswrapper[4717]: I0308 05:50:55.794383 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a2f4071-99c2-4755-af9b-ba683a154d22" path="/var/lib/kubelet/pods/0a2f4071-99c2-4755-af9b-ba683a154d22/volumes" Mar 08 05:50:56 crc kubenswrapper[4717]: I0308 05:50:56.262545 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hfkcg" event={"ID":"8d5a9dec-59bb-4422-b37d-f0fe159f82d4","Type":"ContainerStarted","Data":"4d607c4bdb313b66bd7a7fc1793f91f8ed731f74c7918eca1bf97c4e2e7278c7"} Mar 08 05:50:56 crc kubenswrapper[4717]: I0308 05:50:56.262677 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hfkcg" event={"ID":"8d5a9dec-59bb-4422-b37d-f0fe159f82d4","Type":"ContainerStarted","Data":"5abdca67a0c2fbe27ae94a7d14f3c8cf656927e7bbe76d096759c25ad2e0f0dc"} Mar 08 05:50:56 crc kubenswrapper[4717]: I0308 05:50:56.287416 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hfkcg" podStartSLOduration=2.287393954 podStartE2EDuration="2.287393954s" podCreationTimestamp="2026-03-08 05:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:50:56.282438321 +0000 UTC m=+1483.200087175" watchObservedRunningTime="2026-03-08 
05:50:56.287393954 +0000 UTC m=+1483.205042798" Mar 08 05:50:59 crc kubenswrapper[4717]: I0308 05:50:59.531430 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 05:50:59 crc kubenswrapper[4717]: I0308 05:50:59.532255 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 05:51:00 crc kubenswrapper[4717]: I0308 05:51:00.315406 4717 generic.go:334] "Generic (PLEG): container finished" podID="8d5a9dec-59bb-4422-b37d-f0fe159f82d4" containerID="4d607c4bdb313b66bd7a7fc1793f91f8ed731f74c7918eca1bf97c4e2e7278c7" exitCode=0 Mar 08 05:51:00 crc kubenswrapper[4717]: I0308 05:51:00.315661 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hfkcg" event={"ID":"8d5a9dec-59bb-4422-b37d-f0fe159f82d4","Type":"ContainerDied","Data":"4d607c4bdb313b66bd7a7fc1793f91f8ed731f74c7918eca1bf97c4e2e7278c7"} Mar 08 05:51:00 crc kubenswrapper[4717]: I0308 05:51:00.547926 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.234:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 05:51:00 crc kubenswrapper[4717]: I0308 05:51:00.548354 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.234:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.832018 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.867775 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh87d\" (UniqueName: \"kubernetes.io/projected/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-kube-api-access-nh87d\") pod \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.867929 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-scripts\") pod \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.867966 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-config-data\") pod \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.868057 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-combined-ca-bundle\") pod \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\" (UID: \"8d5a9dec-59bb-4422-b37d-f0fe159f82d4\") " Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.878958 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-kube-api-access-nh87d" (OuterVolumeSpecName: "kube-api-access-nh87d") pod "8d5a9dec-59bb-4422-b37d-f0fe159f82d4" (UID: "8d5a9dec-59bb-4422-b37d-f0fe159f82d4"). InnerVolumeSpecName "kube-api-access-nh87d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.884599 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-scripts" (OuterVolumeSpecName: "scripts") pod "8d5a9dec-59bb-4422-b37d-f0fe159f82d4" (UID: "8d5a9dec-59bb-4422-b37d-f0fe159f82d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.897296 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d5a9dec-59bb-4422-b37d-f0fe159f82d4" (UID: "8d5a9dec-59bb-4422-b37d-f0fe159f82d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.898559 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-config-data" (OuterVolumeSpecName: "config-data") pod "8d5a9dec-59bb-4422-b37d-f0fe159f82d4" (UID: "8d5a9dec-59bb-4422-b37d-f0fe159f82d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.969602 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh87d\" (UniqueName: \"kubernetes.io/projected/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-kube-api-access-nh87d\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.969636 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.969645 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:01 crc kubenswrapper[4717]: I0308 05:51:01.969655 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d5a9dec-59bb-4422-b37d-f0fe159f82d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.339376 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hfkcg" event={"ID":"8d5a9dec-59bb-4422-b37d-f0fe159f82d4","Type":"ContainerDied","Data":"5abdca67a0c2fbe27ae94a7d14f3c8cf656927e7bbe76d096759c25ad2e0f0dc"} Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.339425 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5abdca67a0c2fbe27ae94a7d14f3c8cf656927e7bbe76d096759c25ad2e0f0dc" Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.339493 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hfkcg" Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.561856 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.562182 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerName="nova-api-log" containerID="cri-o://013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a" gracePeriod=30 Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.562409 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerName="nova-api-api" containerID="cri-o://668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6" gracePeriod=30 Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.585405 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.586513 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="42fad031-16bc-4641-9f97-3b89351d0b89" containerName="nova-scheduler-scheduler" containerID="cri-o://a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92" gracePeriod=30 Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.602121 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.607291 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerName="nova-metadata-log" containerID="cri-o://fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96" gracePeriod=30 Mar 08 05:51:02 crc kubenswrapper[4717]: I0308 05:51:02.607957 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerName="nova-metadata-metadata" containerID="cri-o://c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889" gracePeriod=30 Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.349401 4717 generic.go:334] "Generic (PLEG): container finished" podID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerID="fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96" exitCode=143 Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.349471 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d43b16c-5bb0-4724-8df1-2b83168b22ce","Type":"ContainerDied","Data":"fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96"} Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.352252 4717 generic.go:334] "Generic (PLEG): container finished" podID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerID="013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a" exitCode=143 Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.352289 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e283047-d69f-4aa8-9ad0-cd8309013594","Type":"ContainerDied","Data":"013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a"} Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.823567 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.926533 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhf48\" (UniqueName: \"kubernetes.io/projected/42fad031-16bc-4641-9f97-3b89351d0b89-kube-api-access-nhf48\") pod \"42fad031-16bc-4641-9f97-3b89351d0b89\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.926715 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-combined-ca-bundle\") pod \"42fad031-16bc-4641-9f97-3b89351d0b89\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.926978 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-config-data\") pod \"42fad031-16bc-4641-9f97-3b89351d0b89\" (UID: \"42fad031-16bc-4641-9f97-3b89351d0b89\") " Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.939845 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42fad031-16bc-4641-9f97-3b89351d0b89-kube-api-access-nhf48" (OuterVolumeSpecName: "kube-api-access-nhf48") pod "42fad031-16bc-4641-9f97-3b89351d0b89" (UID: "42fad031-16bc-4641-9f97-3b89351d0b89"). InnerVolumeSpecName "kube-api-access-nhf48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.964403 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-config-data" (OuterVolumeSpecName: "config-data") pod "42fad031-16bc-4641-9f97-3b89351d0b89" (UID: "42fad031-16bc-4641-9f97-3b89351d0b89"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:03 crc kubenswrapper[4717]: I0308 05:51:03.969841 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42fad031-16bc-4641-9f97-3b89351d0b89" (UID: "42fad031-16bc-4641-9f97-3b89351d0b89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.029418 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.029453 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42fad031-16bc-4641-9f97-3b89351d0b89-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.029468 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhf48\" (UniqueName: \"kubernetes.io/projected/42fad031-16bc-4641-9f97-3b89351d0b89-kube-api-access-nhf48\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.084491 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.088173 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.134469 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-nova-metadata-tls-certs\") pod \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.134530 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-config-data\") pod \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.134561 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d43b16c-5bb0-4724-8df1-2b83168b22ce-logs\") pod \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.134635 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-internal-tls-certs\") pod \"0e283047-d69f-4aa8-9ad0-cd8309013594\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.134699 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnqdr\" (UniqueName: \"kubernetes.io/projected/9d43b16c-5bb0-4724-8df1-2b83168b22ce-kube-api-access-qnqdr\") pod \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.134762 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-public-tls-certs\") pod \"0e283047-d69f-4aa8-9ad0-cd8309013594\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.134822 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4bvc\" (UniqueName: \"kubernetes.io/projected/0e283047-d69f-4aa8-9ad0-cd8309013594-kube-api-access-z4bvc\") pod \"0e283047-d69f-4aa8-9ad0-cd8309013594\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.134877 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-config-data\") pod \"0e283047-d69f-4aa8-9ad0-cd8309013594\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.134945 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-combined-ca-bundle\") pod \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\" (UID: \"9d43b16c-5bb0-4724-8df1-2b83168b22ce\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.135003 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-combined-ca-bundle\") pod \"0e283047-d69f-4aa8-9ad0-cd8309013594\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.135091 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e283047-d69f-4aa8-9ad0-cd8309013594-logs\") pod \"0e283047-d69f-4aa8-9ad0-cd8309013594\" (UID: \"0e283047-d69f-4aa8-9ad0-cd8309013594\") " Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 
05:51:04.135646 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d43b16c-5bb0-4724-8df1-2b83168b22ce-logs" (OuterVolumeSpecName: "logs") pod "9d43b16c-5bb0-4724-8df1-2b83168b22ce" (UID: "9d43b16c-5bb0-4724-8df1-2b83168b22ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.136368 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e283047-d69f-4aa8-9ad0-cd8309013594-logs" (OuterVolumeSpecName: "logs") pod "0e283047-d69f-4aa8-9ad0-cd8309013594" (UID: "0e283047-d69f-4aa8-9ad0-cd8309013594"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.142083 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d43b16c-5bb0-4724-8df1-2b83168b22ce-kube-api-access-qnqdr" (OuterVolumeSpecName: "kube-api-access-qnqdr") pod "9d43b16c-5bb0-4724-8df1-2b83168b22ce" (UID: "9d43b16c-5bb0-4724-8df1-2b83168b22ce"). InnerVolumeSpecName "kube-api-access-qnqdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.144517 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e283047-d69f-4aa8-9ad0-cd8309013594-kube-api-access-z4bvc" (OuterVolumeSpecName: "kube-api-access-z4bvc") pod "0e283047-d69f-4aa8-9ad0-cd8309013594" (UID: "0e283047-d69f-4aa8-9ad0-cd8309013594"). InnerVolumeSpecName "kube-api-access-z4bvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.181697 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-config-data" (OuterVolumeSpecName: "config-data") pod "0e283047-d69f-4aa8-9ad0-cd8309013594" (UID: "0e283047-d69f-4aa8-9ad0-cd8309013594"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.186122 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e283047-d69f-4aa8-9ad0-cd8309013594" (UID: "0e283047-d69f-4aa8-9ad0-cd8309013594"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.206150 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-config-data" (OuterVolumeSpecName: "config-data") pod "9d43b16c-5bb0-4724-8df1-2b83168b22ce" (UID: "9d43b16c-5bb0-4724-8df1-2b83168b22ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.207218 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d43b16c-5bb0-4724-8df1-2b83168b22ce" (UID: "9d43b16c-5bb0-4724-8df1-2b83168b22ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.207380 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0e283047-d69f-4aa8-9ad0-cd8309013594" (UID: "0e283047-d69f-4aa8-9ad0-cd8309013594"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.216912 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9d43b16c-5bb0-4724-8df1-2b83168b22ce" (UID: "9d43b16c-5bb0-4724-8df1-2b83168b22ce"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.237941 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.237975 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.237985 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e283047-d69f-4aa8-9ad0-cd8309013594-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.237994 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.238005 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d43b16c-5bb0-4724-8df1-2b83168b22ce-logs\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.238013 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d43b16c-5bb0-4724-8df1-2b83168b22ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.238021 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.238029 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnqdr\" (UniqueName: \"kubernetes.io/projected/9d43b16c-5bb0-4724-8df1-2b83168b22ce-kube-api-access-qnqdr\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.238037 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4bvc\" (UniqueName: \"kubernetes.io/projected/0e283047-d69f-4aa8-9ad0-cd8309013594-kube-api-access-z4bvc\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.238045 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.241354 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod 
"0e283047-d69f-4aa8-9ad0-cd8309013594" (UID: "0e283047-d69f-4aa8-9ad0-cd8309013594"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.339119 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e283047-d69f-4aa8-9ad0-cd8309013594-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.363284 4717 generic.go:334] "Generic (PLEG): container finished" podID="42fad031-16bc-4641-9f97-3b89351d0b89" containerID="a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92" exitCode=0 Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.363361 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"42fad031-16bc-4641-9f97-3b89351d0b89","Type":"ContainerDied","Data":"a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92"} Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.363388 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"42fad031-16bc-4641-9f97-3b89351d0b89","Type":"ContainerDied","Data":"accbe5aadc8e7f569d48d142f83cad12921b6db818eebd64dc8e31eb0ef57158"} Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.363393 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.363405 4717 scope.go:117] "RemoveContainer" containerID="a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.365505 4717 generic.go:334] "Generic (PLEG): container finished" podID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerID="c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889" exitCode=0 Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.365541 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d43b16c-5bb0-4724-8df1-2b83168b22ce","Type":"ContainerDied","Data":"c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889"} Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.365555 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d43b16c-5bb0-4724-8df1-2b83168b22ce","Type":"ContainerDied","Data":"aa84f1020c7b31622ad4f9eb958c8164cfd8ea66338a54a6074925eb4f8c7c3e"} Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.365606 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.373625 4717 generic.go:334] "Generic (PLEG): container finished" podID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerID="668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6" exitCode=0 Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.373672 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e283047-d69f-4aa8-9ad0-cd8309013594","Type":"ContainerDied","Data":"668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6"} Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.373718 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e283047-d69f-4aa8-9ad0-cd8309013594","Type":"ContainerDied","Data":"9d58a7d4d0ca1bf15b57d88ce72da85be4568c303195e617f40f23def4ae53be"} Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.373788 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.386352 4717 scope.go:117] "RemoveContainer" containerID="a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.386882 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92\": container with ID starting with a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92 not found: ID does not exist" containerID="a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.386932 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92"} err="failed to get container status \"a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92\": rpc error: code = NotFound desc = could not find container \"a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92\": container with ID starting with a497b117ba957a4e79b2484b603dbab4725cf022a3341cc8fa9737b86a416a92 not found: ID does not exist" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.386969 4717 scope.go:117] "RemoveContainer" containerID="c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.411554 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.427539 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.436463 4717 scope.go:117] "RemoveContainer" containerID="fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 
05:51:04.438829 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.465061 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.470859 4717 scope.go:117] "RemoveContainer" containerID="c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.474794 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889\": container with ID starting with c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889 not found: ID does not exist" containerID="c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.474824 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889"} err="failed to get container status \"c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889\": rpc error: code = NotFound desc = could not find container \"c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889\": container with ID starting with c06cc1bf0c8bca151e3ee52fe665c482a60227da926c158aeb65a8ec52662889 not found: ID does not exist" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.474844 4717 scope.go:117] "RemoveContainer" containerID="fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.478756 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96\": container with ID starting with 
fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96 not found: ID does not exist" containerID="fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.478782 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96"} err="failed to get container status \"fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96\": rpc error: code = NotFound desc = could not find container \"fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96\": container with ID starting with fe902bbd074c7c3e7b230974a411249bcc198b1cced2514e7036ff7b23a7be96 not found: ID does not exist" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.478798 4717 scope.go:117] "RemoveContainer" containerID="668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.481607 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.482102 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerName="nova-api-log" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482120 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerName="nova-api-log" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.482139 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fad031-16bc-4641-9f97-3b89351d0b89" containerName="nova-scheduler-scheduler" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482146 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fad031-16bc-4641-9f97-3b89351d0b89" containerName="nova-scheduler-scheduler" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.482174 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerName="nova-metadata-metadata" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482183 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerName="nova-metadata-metadata" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.482196 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerName="nova-api-api" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482203 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerName="nova-api-api" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.482219 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2f4071-99c2-4755-af9b-ba683a154d22" containerName="init" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482226 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2f4071-99c2-4755-af9b-ba683a154d22" containerName="init" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.482240 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2f4071-99c2-4755-af9b-ba683a154d22" containerName="dnsmasq-dns" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482247 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2f4071-99c2-4755-af9b-ba683a154d22" containerName="dnsmasq-dns" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.482265 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerName="nova-metadata-log" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482273 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerName="nova-metadata-log" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.482286 4717 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8d5a9dec-59bb-4422-b37d-f0fe159f82d4" containerName="nova-manage" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482294 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5a9dec-59bb-4422-b37d-f0fe159f82d4" containerName="nova-manage" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482519 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerName="nova-api-log" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482540 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d5a9dec-59bb-4422-b37d-f0fe159f82d4" containerName="nova-manage" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482556 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerName="nova-metadata-log" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482569 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" containerName="nova-metadata-metadata" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482576 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" containerName="nova-api-api" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482588 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2f4071-99c2-4755-af9b-ba683a154d22" containerName="dnsmasq-dns" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.482600 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="42fad031-16bc-4641-9f97-3b89351d0b89" containerName="nova-scheduler-scheduler" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.483408 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.491319 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.501090 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.509901 4717 scope.go:117] "RemoveContainer" containerID="013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.515239 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.524501 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.531742 4717 scope.go:117] "RemoveContainer" containerID="668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.532119 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6\": container with ID starting with 668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6 not found: ID does not exist" containerID="668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.532150 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6"} err="failed to get container status \"668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6\": rpc error: code = NotFound desc = could not find container \"668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6\": container with ID starting with 
668ac4aaa8d737c841264b990f50f818dfa3c08acbcd4813f00b6f2f0cfe53d6 not found: ID does not exist" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.532167 4717 scope.go:117] "RemoveContainer" containerID="013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a" Mar 08 05:51:04 crc kubenswrapper[4717]: E0308 05:51:04.532954 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a\": container with ID starting with 013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a not found: ID does not exist" containerID="013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.532978 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a"} err="failed to get container status \"013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a\": rpc error: code = NotFound desc = could not find container \"013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a\": container with ID starting with 013aa9dae7b9a1310c462fc2df935bed47365c7574854fe56427625b1765c96a not found: ID does not exist" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.536045 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.537632 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.539799 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.540094 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.542584 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea1dcce-25a3-4f13-960a-5b08bf49e521-config-data\") pod \"nova-scheduler-0\" (UID: \"fea1dcce-25a3-4f13-960a-5b08bf49e521\") " pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.542634 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea1dcce-25a3-4f13-960a-5b08bf49e521-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fea1dcce-25a3-4f13-960a-5b08bf49e521\") " pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.542739 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcgzk\" (UniqueName: \"kubernetes.io/projected/fea1dcce-25a3-4f13-960a-5b08bf49e521-kube-api-access-vcgzk\") pod \"nova-scheduler-0\" (UID: \"fea1dcce-25a3-4f13-960a-5b08bf49e521\") " pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.547373 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.563201 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.565569 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.567323 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.568144 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.568463 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.573495 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.646880 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m8mt\" (UniqueName: \"kubernetes.io/projected/2a5123f2-ce36-401f-90d0-885684623a99-kube-api-access-5m8mt\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.646949 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5315a88e-0b28-4ad7-bc83-278711f6fb29-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.646992 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdlgx\" (UniqueName: \"kubernetes.io/projected/5315a88e-0b28-4ad7-bc83-278711f6fb29-kube-api-access-xdlgx\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647028 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5315a88e-0b28-4ad7-bc83-278711f6fb29-logs\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647095 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-public-tls-certs\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647146 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea1dcce-25a3-4f13-960a-5b08bf49e521-config-data\") pod \"nova-scheduler-0\" (UID: \"fea1dcce-25a3-4f13-960a-5b08bf49e521\") " pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647230 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea1dcce-25a3-4f13-960a-5b08bf49e521-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fea1dcce-25a3-4f13-960a-5b08bf49e521\") " pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647271 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647311 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5315a88e-0b28-4ad7-bc83-278711f6fb29-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647340 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647403 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-config-data\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647543 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5315a88e-0b28-4ad7-bc83-278711f6fb29-config-data\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647608 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcgzk\" (UniqueName: \"kubernetes.io/projected/fea1dcce-25a3-4f13-960a-5b08bf49e521-kube-api-access-vcgzk\") pod \"nova-scheduler-0\" (UID: \"fea1dcce-25a3-4f13-960a-5b08bf49e521\") " pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.647629 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a5123f2-ce36-401f-90d0-885684623a99-logs\") pod \"nova-api-0\" (UID: 
\"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.654015 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea1dcce-25a3-4f13-960a-5b08bf49e521-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fea1dcce-25a3-4f13-960a-5b08bf49e521\") " pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.654024 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fea1dcce-25a3-4f13-960a-5b08bf49e521-config-data\") pod \"nova-scheduler-0\" (UID: \"fea1dcce-25a3-4f13-960a-5b08bf49e521\") " pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.678032 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcgzk\" (UniqueName: \"kubernetes.io/projected/fea1dcce-25a3-4f13-960a-5b08bf49e521-kube-api-access-vcgzk\") pod \"nova-scheduler-0\" (UID: \"fea1dcce-25a3-4f13-960a-5b08bf49e521\") " pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.749870 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m8mt\" (UniqueName: \"kubernetes.io/projected/2a5123f2-ce36-401f-90d0-885684623a99-kube-api-access-5m8mt\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.749927 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5315a88e-0b28-4ad7-bc83-278711f6fb29-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.749964 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xdlgx\" (UniqueName: \"kubernetes.io/projected/5315a88e-0b28-4ad7-bc83-278711f6fb29-kube-api-access-xdlgx\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.749998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5315a88e-0b28-4ad7-bc83-278711f6fb29-logs\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.750022 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-public-tls-certs\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.750075 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.750103 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5315a88e-0b28-4ad7-bc83-278711f6fb29-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.750129 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-internal-tls-certs\") pod \"nova-api-0\" 
(UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.750164 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-config-data\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.750225 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5315a88e-0b28-4ad7-bc83-278711f6fb29-config-data\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.750281 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a5123f2-ce36-401f-90d0-885684623a99-logs\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.750802 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a5123f2-ce36-401f-90d0-885684623a99-logs\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.752291 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5315a88e-0b28-4ad7-bc83-278711f6fb29-logs\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.764585 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5315a88e-0b28-4ad7-bc83-278711f6fb29-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.764795 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.769414 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-config-data\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.769750 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5315a88e-0b28-4ad7-bc83-278711f6fb29-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.770337 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-public-tls-certs\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.770515 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5315a88e-0b28-4ad7-bc83-278711f6fb29-config-data\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.776371 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m8mt\" (UniqueName: \"kubernetes.io/projected/2a5123f2-ce36-401f-90d0-885684623a99-kube-api-access-5m8mt\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.776886 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a5123f2-ce36-401f-90d0-885684623a99-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2a5123f2-ce36-401f-90d0-885684623a99\") " pod="openstack/nova-api-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.792289 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdlgx\" (UniqueName: \"kubernetes.io/projected/5315a88e-0b28-4ad7-bc83-278711f6fb29-kube-api-access-xdlgx\") pod \"nova-metadata-0\" (UID: \"5315a88e-0b28-4ad7-bc83-278711f6fb29\") " pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.811199 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.863190 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 05:51:04 crc kubenswrapper[4717]: I0308 05:51:04.903668 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 05:51:05 crc kubenswrapper[4717]: I0308 05:51:05.295774 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 05:51:05 crc kubenswrapper[4717]: I0308 05:51:05.421307 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fea1dcce-25a3-4f13-960a-5b08bf49e521","Type":"ContainerStarted","Data":"d3fca06784e43f3205a408ae2341e86b23d8eec27b76ac42d7313b5b4d57a9f7"} Mar 08 05:51:05 crc kubenswrapper[4717]: I0308 05:51:05.447038 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 05:51:05 crc kubenswrapper[4717]: I0308 05:51:05.459762 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 05:51:05 crc kubenswrapper[4717]: I0308 05:51:05.795344 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e283047-d69f-4aa8-9ad0-cd8309013594" path="/var/lib/kubelet/pods/0e283047-d69f-4aa8-9ad0-cd8309013594/volumes" Mar 08 05:51:05 crc kubenswrapper[4717]: I0308 05:51:05.797676 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42fad031-16bc-4641-9f97-3b89351d0b89" path="/var/lib/kubelet/pods/42fad031-16bc-4641-9f97-3b89351d0b89/volumes" Mar 08 05:51:05 crc kubenswrapper[4717]: I0308 05:51:05.798447 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d43b16c-5bb0-4724-8df1-2b83168b22ce" path="/var/lib/kubelet/pods/9d43b16c-5bb0-4724-8df1-2b83168b22ce/volumes" Mar 08 05:51:06 crc kubenswrapper[4717]: I0308 05:51:06.438508 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fea1dcce-25a3-4f13-960a-5b08bf49e521","Type":"ContainerStarted","Data":"873e20e5580c333d014998d51356936fd5874899c0237e719cee0d30a2f16f0d"} Mar 08 05:51:06 crc kubenswrapper[4717]: I0308 05:51:06.442675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"5315a88e-0b28-4ad7-bc83-278711f6fb29","Type":"ContainerStarted","Data":"4333a96c2e3a03af3a457dcbb39580595cb56ce963c166bba8a26be8366cd2f0"} Mar 08 05:51:06 crc kubenswrapper[4717]: I0308 05:51:06.442735 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5315a88e-0b28-4ad7-bc83-278711f6fb29","Type":"ContainerStarted","Data":"3968cc149c24586a526507d0b4dfc32f2142aab5a66b855905552f0642fe815b"} Mar 08 05:51:06 crc kubenswrapper[4717]: I0308 05:51:06.442748 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5315a88e-0b28-4ad7-bc83-278711f6fb29","Type":"ContainerStarted","Data":"11f4f2510c3fe5375f19ece05d5df276b01bfec3c5c43c00b3497eacd8e5e4be"} Mar 08 05:51:06 crc kubenswrapper[4717]: I0308 05:51:06.449739 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a5123f2-ce36-401f-90d0-885684623a99","Type":"ContainerStarted","Data":"76404adbcf53fcd69171d640228855421ca76f619b4c081f261bed44ab888ffb"} Mar 08 05:51:06 crc kubenswrapper[4717]: I0308 05:51:06.449779 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a5123f2-ce36-401f-90d0-885684623a99","Type":"ContainerStarted","Data":"b4710ebbc106d9948204e7ba0e230fbda7b1682a022de903ae973a1ec3b59f03"} Mar 08 05:51:06 crc kubenswrapper[4717]: I0308 05:51:06.449789 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2a5123f2-ce36-401f-90d0-885684623a99","Type":"ContainerStarted","Data":"897311f0b1c76afdf928f8e55abb6084bfb64e7af47d1ca313b7b7842e5d250a"} Mar 08 05:51:06 crc kubenswrapper[4717]: I0308 05:51:06.477757 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.477732917 podStartE2EDuration="2.477732917s" podCreationTimestamp="2026-03-08 05:51:04 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:51:06.463282809 +0000 UTC m=+1493.380931663" watchObservedRunningTime="2026-03-08 05:51:06.477732917 +0000 UTC m=+1493.395381771" Mar 08 05:51:06 crc kubenswrapper[4717]: I0308 05:51:06.500850 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.500828239 podStartE2EDuration="2.500828239s" podCreationTimestamp="2026-03-08 05:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:51:06.488598686 +0000 UTC m=+1493.406247540" watchObservedRunningTime="2026-03-08 05:51:06.500828239 +0000 UTC m=+1493.418477093" Mar 08 05:51:06 crc kubenswrapper[4717]: I0308 05:51:06.521282 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.521266284 podStartE2EDuration="2.521266284s" podCreationTimestamp="2026-03-08 05:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:51:06.521033989 +0000 UTC m=+1493.438682843" watchObservedRunningTime="2026-03-08 05:51:06.521266284 +0000 UTC m=+1493.438915128" Mar 08 05:51:09 crc kubenswrapper[4717]: I0308 05:51:09.812373 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 05:51:09 crc kubenswrapper[4717]: I0308 05:51:09.863399 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 05:51:09 crc kubenswrapper[4717]: I0308 05:51:09.863536 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 05:51:14 crc kubenswrapper[4717]: I0308 05:51:14.812043 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-scheduler-0" Mar 08 05:51:14 crc kubenswrapper[4717]: I0308 05:51:14.859337 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 05:51:14 crc kubenswrapper[4717]: I0308 05:51:14.863528 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 05:51:14 crc kubenswrapper[4717]: I0308 05:51:14.863751 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 05:51:14 crc kubenswrapper[4717]: I0308 05:51:14.904533 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 05:51:14 crc kubenswrapper[4717]: I0308 05:51:14.905013 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 05:51:15 crc kubenswrapper[4717]: I0308 05:51:15.605728 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 05:51:15 crc kubenswrapper[4717]: I0308 05:51:15.880974 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5315a88e-0b28-4ad7-bc83-278711f6fb29" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.238:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 05:51:15 crc kubenswrapper[4717]: I0308 05:51:15.880992 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5315a88e-0b28-4ad7-bc83-278711f6fb29" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.238:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 05:51:15 crc kubenswrapper[4717]: I0308 05:51:15.930907 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2a5123f2-ce36-401f-90d0-885684623a99" 
containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.239:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 05:51:15 crc kubenswrapper[4717]: I0308 05:51:15.931407 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2a5123f2-ce36-401f-90d0-885684623a99" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.239:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 05:51:20 crc kubenswrapper[4717]: I0308 05:51:20.575248 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.451217 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tkcqh"] Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.454435 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.482030 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tkcqh"] Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.584163 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-utilities\") pod \"redhat-operators-tkcqh\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.584678 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-catalog-content\") pod \"redhat-operators-tkcqh\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " 
pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.584945 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkc5x\" (UniqueName: \"kubernetes.io/projected/919752ba-56da-468a-af72-eced98208052-kube-api-access-bkc5x\") pod \"redhat-operators-tkcqh\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.686712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkc5x\" (UniqueName: \"kubernetes.io/projected/919752ba-56da-468a-af72-eced98208052-kube-api-access-bkc5x\") pod \"redhat-operators-tkcqh\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.686842 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-utilities\") pod \"redhat-operators-tkcqh\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.686914 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-catalog-content\") pod \"redhat-operators-tkcqh\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.687398 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-utilities\") pod \"redhat-operators-tkcqh\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " 
pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.687448 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-catalog-content\") pod \"redhat-operators-tkcqh\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.715961 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkc5x\" (UniqueName: \"kubernetes.io/projected/919752ba-56da-468a-af72-eced98208052-kube-api-access-bkc5x\") pod \"redhat-operators-tkcqh\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.782266 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.869624 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.884827 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.895299 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.921924 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.922952 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.941966 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Mar 08 05:51:24 crc kubenswrapper[4717]: I0308 05:51:24.948026 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 05:51:25 crc kubenswrapper[4717]: I0308 05:51:25.269764 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tkcqh"] Mar 08 05:51:25 crc kubenswrapper[4717]: I0308 05:51:25.677604 4717 generic.go:334] "Generic (PLEG): container finished" podID="919752ba-56da-468a-af72-eced98208052" containerID="2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b" exitCode=0 Mar 08 05:51:25 crc kubenswrapper[4717]: I0308 05:51:25.677723 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkcqh" event={"ID":"919752ba-56da-468a-af72-eced98208052","Type":"ContainerDied","Data":"2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b"} Mar 08 05:51:25 crc kubenswrapper[4717]: I0308 05:51:25.677778 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkcqh" event={"ID":"919752ba-56da-468a-af72-eced98208052","Type":"ContainerStarted","Data":"f81634c32bdddd56203437ec7495483b1e07d01bc6d52a5d464b1426430e18c8"} Mar 08 05:51:25 crc kubenswrapper[4717]: I0308 05:51:25.678117 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 05:51:25 crc kubenswrapper[4717]: I0308 05:51:25.679830 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 05:51:25 crc kubenswrapper[4717]: I0308 05:51:25.695851 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 05:51:25 crc kubenswrapper[4717]: I0308 05:51:25.696452 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 05:51:26 crc kubenswrapper[4717]: I0308 05:51:26.701957 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkcqh" event={"ID":"919752ba-56da-468a-af72-eced98208052","Type":"ContainerStarted","Data":"d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3"} Mar 08 05:51:28 crc kubenswrapper[4717]: I0308 05:51:28.757106 4717 generic.go:334] "Generic (PLEG): container finished" podID="919752ba-56da-468a-af72-eced98208052" containerID="d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3" exitCode=0 Mar 08 05:51:28 crc kubenswrapper[4717]: I0308 05:51:28.757225 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkcqh" event={"ID":"919752ba-56da-468a-af72-eced98208052","Type":"ContainerDied","Data":"d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3"} Mar 08 05:51:30 crc kubenswrapper[4717]: I0308 05:51:30.783488 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkcqh" event={"ID":"919752ba-56da-468a-af72-eced98208052","Type":"ContainerStarted","Data":"1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81"} Mar 08 05:51:30 crc kubenswrapper[4717]: I0308 05:51:30.816738 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tkcqh" podStartSLOduration=2.722468247 podStartE2EDuration="6.816717655s" podCreationTimestamp="2026-03-08 05:51:24 +0000 UTC" firstStartedPulling="2026-03-08 05:51:25.67953824 +0000 UTC m=+1512.597187094" lastFinishedPulling="2026-03-08 05:51:29.773787628 +0000 UTC m=+1516.691436502" observedRunningTime="2026-03-08 05:51:30.806226715 +0000 UTC m=+1517.723875569" watchObservedRunningTime="2026-03-08 05:51:30.816717655 +0000 UTC m=+1517.734366499" Mar 08 05:51:33 crc kubenswrapper[4717]: I0308 05:51:33.907197 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 05:51:34 crc kubenswrapper[4717]: I0308 05:51:34.782545 4717 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tkcqh"
Mar 08 05:51:34 crc kubenswrapper[4717]: I0308 05:51:34.782915 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tkcqh"
Mar 08 05:51:34 crc kubenswrapper[4717]: I0308 05:51:34.862506 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 08 05:51:35 crc kubenswrapper[4717]: I0308 05:51:35.841833 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tkcqh" podUID="919752ba-56da-468a-af72-eced98208052" containerName="registry-server" probeResult="failure" output=<
Mar 08 05:51:35 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s
Mar 08 05:51:35 crc kubenswrapper[4717]: >
Mar 08 05:51:37 crc kubenswrapper[4717]: I0308 05:51:37.300868 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" containerName="rabbitmq" containerID="cri-o://c4fe89ad42bcf643e8f2825838c44b2ea73d780d43a4d90b6ca83118e44bcafe" gracePeriod=604797
Mar 08 05:51:38 crc kubenswrapper[4717]: I0308 05:51:38.170292 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d4a94056-9d2f-45ef-afa3-cf858787fc87" containerName="rabbitmq" containerID="cri-o://5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5" gracePeriod=604797
Mar 08 05:51:38 crc kubenswrapper[4717]: I0308 05:51:38.892456 4717 generic.go:334] "Generic (PLEG): container finished" podID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" containerID="c4fe89ad42bcf643e8f2825838c44b2ea73d780d43a4d90b6ca83118e44bcafe" exitCode=0
Mar 08 05:51:38 crc kubenswrapper[4717]: I0308 05:51:38.892783 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ce570a4-b883-4b07-a4a2-e5e820ab538c","Type":"ContainerDied","Data":"c4fe89ad42bcf643e8f2825838c44b2ea73d780d43a4d90b6ca83118e44bcafe"}
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.033075 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.126350 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-tls\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.126457 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.127358 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-confd\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.127382 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-config-data\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.127509 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-plugins\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.127620 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ce570a4-b883-4b07-a4a2-e5e820ab538c-erlang-cookie-secret\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.127672 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-server-conf\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.127726 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-plugins-conf\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.127758 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-erlang-cookie\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.127891 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ce570a4-b883-4b07-a4a2-e5e820ab538c-pod-info\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.127970 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd4km\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-kube-api-access-hd4km\") pod \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\" (UID: \"7ce570a4-b883-4b07-a4a2-e5e820ab538c\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.129983 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.130312 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.130641 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.137064 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce570a4-b883-4b07-a4a2-e5e820ab538c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.137626 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.138904 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-kube-api-access-hd4km" (OuterVolumeSpecName: "kube-api-access-hd4km") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "kube-api-access-hd4km". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.165868 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7ce570a4-b883-4b07-a4a2-e5e820ab538c-pod-info" (OuterVolumeSpecName: "pod-info") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.177655 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.193215 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-config-data" (OuterVolumeSpecName: "config-data") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.215169 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-server-conf" (OuterVolumeSpecName: "server-conf") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.231153 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.231197 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.231208 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.231217 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.231226 4717 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ce570a4-b883-4b07-a4a2-e5e820ab538c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.231235 4717 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-server-conf\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.231242 4717 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ce570a4-b883-4b07-a4a2-e5e820ab538c-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.231252 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.231261 4717 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ce570a4-b883-4b07-a4a2-e5e820ab538c-pod-info\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.231269 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd4km\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-kube-api-access-hd4km\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.272809 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.290355 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7ce570a4-b883-4b07-a4a2-e5e820ab538c" (UID: "7ce570a4-b883-4b07-a4a2-e5e820ab538c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.333534 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.333563 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ce570a4-b883-4b07-a4a2-e5e820ab538c-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.716992 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844405 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4a94056-9d2f-45ef-afa3-cf858787fc87-pod-info\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844456 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-tls\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844483 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-plugins-conf\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844530 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4a94056-9d2f-45ef-afa3-cf858787fc87-erlang-cookie-secret\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844619 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-server-conf\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844705 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-erlang-cookie\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844725 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-confd\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844773 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-plugins\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844796 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-config-data\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844816 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.844833 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssdq2\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-kube-api-access-ssdq2\") pod \"d4a94056-9d2f-45ef-afa3-cf858787fc87\" (UID: \"d4a94056-9d2f-45ef-afa3-cf858787fc87\") "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.846318 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.846490 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.847088 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.851016 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4a94056-9d2f-45ef-afa3-cf858787fc87-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.851284 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.877225 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-kube-api-access-ssdq2" (OuterVolumeSpecName: "kube-api-access-ssdq2") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "kube-api-access-ssdq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.877365 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d4a94056-9d2f-45ef-afa3-cf858787fc87-pod-info" (OuterVolumeSpecName: "pod-info") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.877710 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.881306 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-config-data" (OuterVolumeSpecName: "config-data") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.906105 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ce570a4-b883-4b07-a4a2-e5e820ab538c","Type":"ContainerDied","Data":"6e1d70b1123f34f8f3389305426f7651bafa6aca941cf020a0f2f6259c7a8cb4"}
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.906152 4717 scope.go:117] "RemoveContainer" containerID="c4fe89ad42bcf643e8f2825838c44b2ea73d780d43a4d90b6ca83118e44bcafe"
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.906187 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.910813 4717 generic.go:334] "Generic (PLEG): container finished" podID="d4a94056-9d2f-45ef-afa3-cf858787fc87" containerID="5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5" exitCode=0
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.911152 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d4a94056-9d2f-45ef-afa3-cf858787fc87","Type":"ContainerDied","Data":"5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5"}
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.911175 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d4a94056-9d2f-45ef-afa3-cf858787fc87","Type":"ContainerDied","Data":"d32b6f3314933d1837a2bc28529f090a98ed82bc3e777da9377897b313f4722a"}
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.911317 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.951744 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.951771 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.951780 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.951799 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.951807 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssdq2\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-kube-api-access-ssdq2\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.951816 4717 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4a94056-9d2f-45ef-afa3-cf858787fc87-pod-info\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.951826 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.951835 4717 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.951843 4717 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4a94056-9d2f-45ef-afa3-cf858787fc87-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.952457 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-server-conf" (OuterVolumeSpecName: "server-conf") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.976255 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d4a94056-9d2f-45ef-afa3-cf858787fc87" (UID: "d4a94056-9d2f-45ef-afa3-cf858787fc87"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:51:39 crc kubenswrapper[4717]: I0308 05:51:39.976392 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.003091 4717 scope.go:117] "RemoveContainer" containerID="ae723f586b191c6db521638d50b64a0c103a18f55e7dfb6f19046bc98fd39696"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.012885 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.028845 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.031540 4717 scope.go:117] "RemoveContainer" containerID="5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.053783 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.054108 4717 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4a94056-9d2f-45ef-afa3-cf858787fc87-server-conf\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.054138 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4a94056-9d2f-45ef-afa3-cf858787fc87-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.054148 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 08 05:51:40 crc kubenswrapper[4717]: E0308 05:51:40.054215 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" containerName="setup-container"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.054228 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" containerName="setup-container"
Mar 08 05:51:40 crc kubenswrapper[4717]: E0308 05:51:40.054264 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a94056-9d2f-45ef-afa3-cf858787fc87" containerName="setup-container"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.054271 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a94056-9d2f-45ef-afa3-cf858787fc87" containerName="setup-container"
Mar 08 05:51:40 crc kubenswrapper[4717]: E0308 05:51:40.054281 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" containerName="rabbitmq"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.054288 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" containerName="rabbitmq"
Mar 08 05:51:40 crc kubenswrapper[4717]: E0308 05:51:40.054298 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a94056-9d2f-45ef-afa3-cf858787fc87" containerName="rabbitmq"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.054304 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a94056-9d2f-45ef-afa3-cf858787fc87" containerName="rabbitmq"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.054494 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" containerName="rabbitmq"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.054508 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a94056-9d2f-45ef-afa3-cf858787fc87" containerName="rabbitmq"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.055516 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.059315 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.059388 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.059540 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.059582 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.059547 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.062335 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-949rb"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.062651 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.064270 4717 scope.go:117] "RemoveContainer" containerID="77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.075373 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.100370 4717 scope.go:117] "RemoveContainer" containerID="5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5"
Mar 08 05:51:40 crc kubenswrapper[4717]: E0308 05:51:40.100827 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5\": container with ID starting with 5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5 not found: ID does not exist" containerID="5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.100863 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5"} err="failed to get container status \"5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5\": rpc error: code = NotFound desc = could not find container \"5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5\": container with ID starting with 5eae1c5ca82ae4cc5454ddd5c6bdaf777dcb82efb5bac89da3335d4ad397c1a5 not found: ID does not exist"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.100883 4717 scope.go:117] "RemoveContainer" containerID="77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e"
Mar 08 05:51:40 crc kubenswrapper[4717]: E0308 05:51:40.101321 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e\": container with ID starting with 77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e not found: ID does not exist" containerID="77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.101349 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e"} err="failed to get container status \"77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e\": rpc error: code = NotFound desc = could not find container \"77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e\": container with ID starting with 77920dc17660de44ad3c9c4d7e5dd1cc6e59d3d063ce50a42124135bac0be95e not found: ID does not exist"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.155926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.156002 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.156032 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.156084 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.156138 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.156159 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lng4\" (UniqueName: \"kubernetes.io/projected/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-kube-api-access-9lng4\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.156197 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-config-data\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.156239 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.156257 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.156288 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.156310 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.249962 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.258445 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.258515 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.258564 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.258618 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0"
Mar 08 05:51:40 crc kubenswrapper[4717]: I0308
05:51:40.258640 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.258662 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.258719 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.258775 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.258797 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lng4\" (UniqueName: \"kubernetes.io/projected/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-kube-api-access-9lng4\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.258841 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-config-data\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.258859 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.259023 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.259558 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.259675 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.259692 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " 
pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.259643 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.260251 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-config-data\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.262601 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.263602 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.273420 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.273532 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc 
kubenswrapper[4717]: I0308 05:51:40.274418 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.282203 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.282477 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lng4\" (UniqueName: \"kubernetes.io/projected/c1331e99-d131-4f8c-ae4e-6217cf54ddaf-kube-api-access-9lng4\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.284332 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.287952 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.288140 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.288340 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lqqnh" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.288539 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.288648 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.288759 4717 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.289044 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.293577 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.336409 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"c1331e99-d131-4f8c-ae4e-6217cf54ddaf\") " pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.361517 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.361562 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.361595 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 
05:51:40.361637 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.361652 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzgnz\" (UniqueName: \"kubernetes.io/projected/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-kube-api-access-vzgnz\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.361702 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.361755 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.361780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 
05:51:40.361796 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.361854 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.361890 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.374774 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.463775 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.463844 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.464537 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.464555 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzgnz\" (UniqueName: \"kubernetes.io/projected/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-kube-api-access-vzgnz\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.464584 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.464640 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.464663 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.464695 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.464746 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.464772 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.464798 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.465526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.465980 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.466152 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.467970 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.468004 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-config-data\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.468262 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.468616 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.471823 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.477234 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.492779 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc 
kubenswrapper[4717]: I0308 05:51:40.498754 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzgnz\" (UniqueName: \"kubernetes.io/projected/4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7-kube-api-access-vzgnz\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.502471 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.599390 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.855348 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 05:51:40 crc kubenswrapper[4717]: I0308 05:51:40.956170 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c1331e99-d131-4f8c-ae4e-6217cf54ddaf","Type":"ContainerStarted","Data":"40900d922507b5934cd025cbff55960bcf09aea061d54add8df11212d1ee39fa"} Mar 08 05:51:41 crc kubenswrapper[4717]: W0308 05:51:41.053360 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a7bf3f6_cc6f_4e57_9b1e_3b1b54c0a1f7.slice/crio-5aa6e3a70052e7a982933df5ff1e86a50e7c6d9fc08d9414421d7ef167633110 WatchSource:0}: Error finding container 5aa6e3a70052e7a982933df5ff1e86a50e7c6d9fc08d9414421d7ef167633110: Status 404 returned error can't find the container with id 5aa6e3a70052e7a982933df5ff1e86a50e7c6d9fc08d9414421d7ef167633110 Mar 08 05:51:41 crc kubenswrapper[4717]: I0308 05:51:41.054219 4717 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 05:51:41 crc kubenswrapper[4717]: I0308 05:51:41.814406 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce570a4-b883-4b07-a4a2-e5e820ab538c" path="/var/lib/kubelet/pods/7ce570a4-b883-4b07-a4a2-e5e820ab538c/volumes" Mar 08 05:51:41 crc kubenswrapper[4717]: I0308 05:51:41.817309 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a94056-9d2f-45ef-afa3-cf858787fc87" path="/var/lib/kubelet/pods/d4a94056-9d2f-45ef-afa3-cf858787fc87/volumes" Mar 08 05:51:41 crc kubenswrapper[4717]: I0308 05:51:41.974383 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7","Type":"ContainerStarted","Data":"5aa6e3a70052e7a982933df5ff1e86a50e7c6d9fc08d9414421d7ef167633110"} Mar 08 05:51:42 crc kubenswrapper[4717]: I0308 05:51:42.988371 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c1331e99-d131-4f8c-ae4e-6217cf54ddaf","Type":"ContainerStarted","Data":"84b2ed6169558aac6b0fd1d92039e0f28181e491d33a147160d69ab424e88b10"} Mar 08 05:51:42 crc kubenswrapper[4717]: I0308 05:51:42.993317 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7","Type":"ContainerStarted","Data":"3a0caf0c1ba783f6a3d4b40d25ddc69a2138a7db7ce8238543a8ab5927d88a44"} Mar 08 05:51:45 crc kubenswrapper[4717]: I0308 05:51:45.854072 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tkcqh" podUID="919752ba-56da-468a-af72-eced98208052" containerName="registry-server" probeResult="failure" output=< Mar 08 05:51:45 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 05:51:45 crc kubenswrapper[4717]: > Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.381569 4717 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-546cc8c9-nzj7h"] Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.383943 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.388082 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.401885 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546cc8c9-nzj7h"] Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.421126 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-config\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.421172 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-nb\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.421265 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-swift-storage-0\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.421304 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-svc\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.421340 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-sb\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.421378 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgwjp\" (UniqueName: \"kubernetes.io/projected/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-kube-api-access-wgwjp\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.421454 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-openstack-edpm-ipam\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.523671 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-config\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.523730 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-nb\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.523804 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-swift-storage-0\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.523833 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-svc\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.523863 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-sb\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.523928 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgwjp\" (UniqueName: \"kubernetes.io/projected/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-kube-api-access-wgwjp\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.524382 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-openstack-edpm-ipam\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.524874 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-svc\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.524931 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-config\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.524938 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-nb\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.524944 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-sb\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.525126 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-swift-storage-0\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: 
\"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.525575 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-openstack-edpm-ipam\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.546727 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgwjp\" (UniqueName: \"kubernetes.io/projected/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-kube-api-access-wgwjp\") pod \"dnsmasq-dns-546cc8c9-nzj7h\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:47 crc kubenswrapper[4717]: I0308 05:51:47.723776 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:48 crc kubenswrapper[4717]: I0308 05:51:48.190107 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546cc8c9-nzj7h"] Mar 08 05:51:49 crc kubenswrapper[4717]: I0308 05:51:49.082334 4717 generic.go:334] "Generic (PLEG): container finished" podID="45fd46fc-ae39-4ce2-8489-1da3e3178f2f" containerID="a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2" exitCode=0 Mar 08 05:51:49 crc kubenswrapper[4717]: I0308 05:51:49.082642 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" event={"ID":"45fd46fc-ae39-4ce2-8489-1da3e3178f2f","Type":"ContainerDied","Data":"a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2"} Mar 08 05:51:49 crc kubenswrapper[4717]: I0308 05:51:49.082671 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" 
event={"ID":"45fd46fc-ae39-4ce2-8489-1da3e3178f2f","Type":"ContainerStarted","Data":"528c293fa1fc295ac6ca4f5d4fc4153c8d6aafa42990c39f4a32eca3b6c335bd"} Mar 08 05:51:50 crc kubenswrapper[4717]: I0308 05:51:50.094162 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" event={"ID":"45fd46fc-ae39-4ce2-8489-1da3e3178f2f","Type":"ContainerStarted","Data":"a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88"} Mar 08 05:51:50 crc kubenswrapper[4717]: I0308 05:51:50.095168 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:50 crc kubenswrapper[4717]: I0308 05:51:50.117547 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" podStartSLOduration=3.117524508 podStartE2EDuration="3.117524508s" podCreationTimestamp="2026-03-08 05:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:51:50.115725734 +0000 UTC m=+1537.033374598" watchObservedRunningTime="2026-03-08 05:51:50.117524508 +0000 UTC m=+1537.035173362" Mar 08 05:51:54 crc kubenswrapper[4717]: I0308 05:51:54.855701 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:54 crc kubenswrapper[4717]: I0308 05:51:54.920084 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:55 crc kubenswrapper[4717]: I0308 05:51:55.661421 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tkcqh"] Mar 08 05:51:56 crc kubenswrapper[4717]: I0308 05:51:56.174489 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tkcqh" podUID="919752ba-56da-468a-af72-eced98208052" 
containerName="registry-server" containerID="cri-o://1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81" gracePeriod=2 Mar 08 05:51:56 crc kubenswrapper[4717]: I0308 05:51:56.691509 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:56 crc kubenswrapper[4717]: I0308 05:51:56.846952 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-catalog-content\") pod \"919752ba-56da-468a-af72-eced98208052\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " Mar 08 05:51:56 crc kubenswrapper[4717]: I0308 05:51:56.866102 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkc5x\" (UniqueName: \"kubernetes.io/projected/919752ba-56da-468a-af72-eced98208052-kube-api-access-bkc5x\") pod \"919752ba-56da-468a-af72-eced98208052\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " Mar 08 05:51:56 crc kubenswrapper[4717]: I0308 05:51:56.866180 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-utilities\") pod \"919752ba-56da-468a-af72-eced98208052\" (UID: \"919752ba-56da-468a-af72-eced98208052\") " Mar 08 05:51:56 crc kubenswrapper[4717]: I0308 05:51:56.866852 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-utilities" (OuterVolumeSpecName: "utilities") pod "919752ba-56da-468a-af72-eced98208052" (UID: "919752ba-56da-468a-af72-eced98208052"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:51:56 crc kubenswrapper[4717]: I0308 05:51:56.867943 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:56 crc kubenswrapper[4717]: I0308 05:51:56.880955 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919752ba-56da-468a-af72-eced98208052-kube-api-access-bkc5x" (OuterVolumeSpecName: "kube-api-access-bkc5x") pod "919752ba-56da-468a-af72-eced98208052" (UID: "919752ba-56da-468a-af72-eced98208052"). InnerVolumeSpecName "kube-api-access-bkc5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:51:56 crc kubenswrapper[4717]: I0308 05:51:56.970593 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkc5x\" (UniqueName: \"kubernetes.io/projected/919752ba-56da-468a-af72-eced98208052-kube-api-access-bkc5x\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:56 crc kubenswrapper[4717]: I0308 05:51:56.991005 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "919752ba-56da-468a-af72-eced98208052" (UID: "919752ba-56da-468a-af72-eced98208052"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.072014 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919752ba-56da-468a-af72-eced98208052-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.186432 4717 generic.go:334] "Generic (PLEG): container finished" podID="919752ba-56da-468a-af72-eced98208052" containerID="1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81" exitCode=0 Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.186495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkcqh" event={"ID":"919752ba-56da-468a-af72-eced98208052","Type":"ContainerDied","Data":"1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81"} Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.186547 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tkcqh" event={"ID":"919752ba-56da-468a-af72-eced98208052","Type":"ContainerDied","Data":"f81634c32bdddd56203437ec7495483b1e07d01bc6d52a5d464b1426430e18c8"} Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.186579 4717 scope.go:117] "RemoveContainer" containerID="1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.186593 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tkcqh" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.212430 4717 scope.go:117] "RemoveContainer" containerID="d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.251755 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tkcqh"] Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.260221 4717 scope.go:117] "RemoveContainer" containerID="2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.270272 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tkcqh"] Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.308584 4717 scope.go:117] "RemoveContainer" containerID="1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81" Mar 08 05:51:57 crc kubenswrapper[4717]: E0308 05:51:57.309726 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81\": container with ID starting with 1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81 not found: ID does not exist" containerID="1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.309784 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81"} err="failed to get container status \"1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81\": rpc error: code = NotFound desc = could not find container \"1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81\": container with ID starting with 1cde1594d5b6dfa22ef4fcf22af2c64efda36d7780dc12d21eeb35a52974be81 not found: ID does 
not exist" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.309815 4717 scope.go:117] "RemoveContainer" containerID="d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3" Mar 08 05:51:57 crc kubenswrapper[4717]: E0308 05:51:57.310674 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3\": container with ID starting with d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3 not found: ID does not exist" containerID="d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.310836 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3"} err="failed to get container status \"d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3\": rpc error: code = NotFound desc = could not find container \"d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3\": container with ID starting with d2e319dd72ef878c60e9fda54697868be641c865c51056e7738fe6032ecc12c3 not found: ID does not exist" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.310856 4717 scope.go:117] "RemoveContainer" containerID="2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b" Mar 08 05:51:57 crc kubenswrapper[4717]: E0308 05:51:57.312673 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b\": container with ID starting with 2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b not found: ID does not exist" containerID="2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.312741 4717 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b"} err="failed to get container status \"2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b\": rpc error: code = NotFound desc = could not find container \"2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b\": container with ID starting with 2c0d76d19bc124028b20a0ad6a2c75b3f63293fcf920f789cf32ece510b3787b not found: ID does not exist" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.725877 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.821662 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919752ba-56da-468a-af72-eced98208052" path="/var/lib/kubelet/pods/919752ba-56da-468a-af72-eced98208052/volumes" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.838108 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59b79f7d5c-dwvdr"] Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.838408 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" podUID="9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" containerName="dnsmasq-dns" containerID="cri-o://290f88ff410ade0662248512c1eeba995eac6f021fc98db7ac1ff100faebfd4d" gracePeriod=10 Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.977713 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9f5648895-t45xw"] Mar 08 05:51:57 crc kubenswrapper[4717]: E0308 05:51:57.978111 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919752ba-56da-468a-af72-eced98208052" containerName="extract-content" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.978129 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="919752ba-56da-468a-af72-eced98208052" 
containerName="extract-content" Mar 08 05:51:57 crc kubenswrapper[4717]: E0308 05:51:57.978141 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919752ba-56da-468a-af72-eced98208052" containerName="extract-utilities" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.978148 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="919752ba-56da-468a-af72-eced98208052" containerName="extract-utilities" Mar 08 05:51:57 crc kubenswrapper[4717]: E0308 05:51:57.978157 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919752ba-56da-468a-af72-eced98208052" containerName="registry-server" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.978163 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="919752ba-56da-468a-af72-eced98208052" containerName="registry-server" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.978349 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="919752ba-56da-468a-af72-eced98208052" containerName="registry-server" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.982011 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:57 crc kubenswrapper[4717]: I0308 05:51:57.993852 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9f5648895-t45xw"] Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.098293 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-ovsdbserver-sb\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.098352 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-ovsdbserver-nb\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.098495 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-openstack-edpm-ipam\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.098634 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-dns-swift-storage-0\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.098748 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-config\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.098819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-dns-svc\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.098865 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh9g6\" (UniqueName: \"kubernetes.io/projected/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-kube-api-access-jh9g6\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.200992 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-openstack-edpm-ipam\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.201558 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-dns-swift-storage-0\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.202398 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-dns-swift-storage-0\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.202402 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-openstack-edpm-ipam\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.202489 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-config\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.202576 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-dns-svc\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.202647 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh9g6\" (UniqueName: \"kubernetes.io/projected/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-kube-api-access-jh9g6\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.202835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-ovsdbserver-sb\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.202897 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-ovsdbserver-nb\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.203474 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-config\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.204768 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-ovsdbserver-sb\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.206106 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-dns-svc\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.206521 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-ovsdbserver-nb\") pod 
\"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.211877 4717 generic.go:334] "Generic (PLEG): container finished" podID="9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" containerID="290f88ff410ade0662248512c1eeba995eac6f021fc98db7ac1ff100faebfd4d" exitCode=0 Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.211930 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" event={"ID":"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d","Type":"ContainerDied","Data":"290f88ff410ade0662248512c1eeba995eac6f021fc98db7ac1ff100faebfd4d"} Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.225786 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh9g6\" (UniqueName: \"kubernetes.io/projected/b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9-kube-api-access-jh9g6\") pod \"dnsmasq-dns-9f5648895-t45xw\" (UID: \"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9\") " pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.322593 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.332221 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.513480 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-svc\") pod \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.513795 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-swift-storage-0\") pod \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.513856 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqkn5\" (UniqueName: \"kubernetes.io/projected/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-kube-api-access-zqkn5\") pod \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.513917 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-sb\") pod \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.514046 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-config\") pod \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.514077 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-nb\") pod \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\" (UID: \"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d\") " Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.539114 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-kube-api-access-zqkn5" (OuterVolumeSpecName: "kube-api-access-zqkn5") pod "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" (UID: "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d"). InnerVolumeSpecName "kube-api-access-zqkn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.575660 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" (UID: "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.590377 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" (UID: "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.591712 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-config" (OuterVolumeSpecName: "config") pod "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" (UID: "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.596478 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" (UID: "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.597017 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" (UID: "9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.616775 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.616809 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.616819 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.616827 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.616837 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqkn5\" (UniqueName: \"kubernetes.io/projected/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-kube-api-access-zqkn5\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.616846 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:51:58 crc kubenswrapper[4717]: I0308 05:51:58.838917 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9f5648895-t45xw"] Mar 08 05:51:58 crc kubenswrapper[4717]: W0308 05:51:58.839193 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5a2e6e1_5cab_4e70_b0a2_0d7d6eccc1f9.slice/crio-7883230441d3404e38ebfcb2453dbc2dbf46360832c8c70d19e9dd4015b3c3ee WatchSource:0}: Error finding container 7883230441d3404e38ebfcb2453dbc2dbf46360832c8c70d19e9dd4015b3c3ee: Status 404 returned error can't find the container with id 7883230441d3404e38ebfcb2453dbc2dbf46360832c8c70d19e9dd4015b3c3ee Mar 08 05:51:59 crc kubenswrapper[4717]: I0308 05:51:59.224611 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" Mar 08 05:51:59 crc kubenswrapper[4717]: I0308 05:51:59.224611 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b79f7d5c-dwvdr" event={"ID":"9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d","Type":"ContainerDied","Data":"90f1d2a986b2ec1291df5e4452c6c0a31add797e977fa401a4f20d88bcab3340"} Mar 08 05:51:59 crc kubenswrapper[4717]: I0308 05:51:59.225020 4717 scope.go:117] "RemoveContainer" containerID="290f88ff410ade0662248512c1eeba995eac6f021fc98db7ac1ff100faebfd4d" Mar 08 05:51:59 crc kubenswrapper[4717]: I0308 05:51:59.226365 4717 generic.go:334] "Generic (PLEG): container finished" podID="b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9" containerID="856fba460efa8a0f67aebe3fd1649e7b01637181078efea5c2040cfcb2963c8f" exitCode=0 Mar 08 05:51:59 crc kubenswrapper[4717]: I0308 05:51:59.226391 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f5648895-t45xw" event={"ID":"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9","Type":"ContainerDied","Data":"856fba460efa8a0f67aebe3fd1649e7b01637181078efea5c2040cfcb2963c8f"} Mar 08 05:51:59 crc kubenswrapper[4717]: I0308 05:51:59.226421 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f5648895-t45xw" event={"ID":"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9","Type":"ContainerStarted","Data":"7883230441d3404e38ebfcb2453dbc2dbf46360832c8c70d19e9dd4015b3c3ee"} Mar 08 05:51:59 crc kubenswrapper[4717]: I0308 05:51:59.250980 4717 scope.go:117] "RemoveContainer" containerID="b3b6705875ce1f1de5d557a9e807f71e9b7124a221e9c25638f9a843dde230f7" Mar 08 05:51:59 crc kubenswrapper[4717]: I0308 05:51:59.289469 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59b79f7d5c-dwvdr"] Mar 08 05:51:59 crc kubenswrapper[4717]: I0308 05:51:59.297835 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59b79f7d5c-dwvdr"] Mar 08 05:51:59 crc kubenswrapper[4717]: I0308 
05:51:59.814712 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" path="/var/lib/kubelet/pods/9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d/volumes" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.138910 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549152-l8zn2"] Mar 08 05:52:00 crc kubenswrapper[4717]: E0308 05:52:00.139778 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" containerName="init" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.139825 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" containerName="init" Mar 08 05:52:00 crc kubenswrapper[4717]: E0308 05:52:00.139851 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" containerName="dnsmasq-dns" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.139869 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" containerName="dnsmasq-dns" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.140408 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6e938f-e9f5-42cb-b0b3-3b3b35ae895d" containerName="dnsmasq-dns" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.141883 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549152-l8zn2" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.146053 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.146969 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.147055 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.154237 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549152-l8zn2"] Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.240455 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f5648895-t45xw" event={"ID":"b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9","Type":"ContainerStarted","Data":"2cd408718ef0d07d15f7f90c8529638f7a6d5a94a5998763fc14dc2b9a993603"} Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.240881 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.246531 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnn2\" (UniqueName: \"kubernetes.io/projected/431d1972-afa5-46c3-97f4-2de7cd1a578a-kube-api-access-nhnn2\") pod \"auto-csr-approver-29549152-l8zn2\" (UID: \"431d1972-afa5-46c3-97f4-2de7cd1a578a\") " pod="openshift-infra/auto-csr-approver-29549152-l8zn2" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.269890 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9f5648895-t45xw" podStartSLOduration=3.269864156 podStartE2EDuration="3.269864156s" podCreationTimestamp="2026-03-08 05:51:57 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:52:00.258908955 +0000 UTC m=+1547.176557839" watchObservedRunningTime="2026-03-08 05:52:00.269864156 +0000 UTC m=+1547.187513040" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.348626 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnn2\" (UniqueName: \"kubernetes.io/projected/431d1972-afa5-46c3-97f4-2de7cd1a578a-kube-api-access-nhnn2\") pod \"auto-csr-approver-29549152-l8zn2\" (UID: \"431d1972-afa5-46c3-97f4-2de7cd1a578a\") " pod="openshift-infra/auto-csr-approver-29549152-l8zn2" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.380631 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnn2\" (UniqueName: \"kubernetes.io/projected/431d1972-afa5-46c3-97f4-2de7cd1a578a-kube-api-access-nhnn2\") pod \"auto-csr-approver-29549152-l8zn2\" (UID: \"431d1972-afa5-46c3-97f4-2de7cd1a578a\") " pod="openshift-infra/auto-csr-approver-29549152-l8zn2" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.476101 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549152-l8zn2" Mar 08 05:52:00 crc kubenswrapper[4717]: I0308 05:52:00.984873 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549152-l8zn2"] Mar 08 05:52:01 crc kubenswrapper[4717]: W0308 05:52:00.999857 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod431d1972_afa5_46c3_97f4_2de7cd1a578a.slice/crio-686fca797d0ab08b6dcff15bb2a66bca5c2bf7cb919cb3ea81f8d2715a476c66 WatchSource:0}: Error finding container 686fca797d0ab08b6dcff15bb2a66bca5c2bf7cb919cb3ea81f8d2715a476c66: Status 404 returned error can't find the container with id 686fca797d0ab08b6dcff15bb2a66bca5c2bf7cb919cb3ea81f8d2715a476c66 Mar 08 05:52:01 crc kubenswrapper[4717]: I0308 05:52:01.253642 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549152-l8zn2" event={"ID":"431d1972-afa5-46c3-97f4-2de7cd1a578a","Type":"ContainerStarted","Data":"686fca797d0ab08b6dcff15bb2a66bca5c2bf7cb919cb3ea81f8d2715a476c66"} Mar 08 05:52:02 crc kubenswrapper[4717]: I0308 05:52:02.264080 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549152-l8zn2" event={"ID":"431d1972-afa5-46c3-97f4-2de7cd1a578a","Type":"ContainerStarted","Data":"ee462ee56f5c19d14d4d900eff41855441d1b938f8749802a8f55fb28288db38"} Mar 08 05:52:02 crc kubenswrapper[4717]: I0308 05:52:02.294108 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549152-l8zn2" podStartSLOduration=1.403955068 podStartE2EDuration="2.294087343s" podCreationTimestamp="2026-03-08 05:52:00 +0000 UTC" firstStartedPulling="2026-03-08 05:52:01.001432414 +0000 UTC m=+1547.919081258" lastFinishedPulling="2026-03-08 05:52:01.891564679 +0000 UTC m=+1548.809213533" observedRunningTime="2026-03-08 05:52:02.280775153 +0000 UTC m=+1549.198424017" 
watchObservedRunningTime="2026-03-08 05:52:02.294087343 +0000 UTC m=+1549.211736197" Mar 08 05:52:03 crc kubenswrapper[4717]: I0308 05:52:03.280550 4717 generic.go:334] "Generic (PLEG): container finished" podID="431d1972-afa5-46c3-97f4-2de7cd1a578a" containerID="ee462ee56f5c19d14d4d900eff41855441d1b938f8749802a8f55fb28288db38" exitCode=0 Mar 08 05:52:03 crc kubenswrapper[4717]: I0308 05:52:03.280659 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549152-l8zn2" event={"ID":"431d1972-afa5-46c3-97f4-2de7cd1a578a","Type":"ContainerDied","Data":"ee462ee56f5c19d14d4d900eff41855441d1b938f8749802a8f55fb28288db38"} Mar 08 05:52:04 crc kubenswrapper[4717]: I0308 05:52:04.120082 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:52:04 crc kubenswrapper[4717]: I0308 05:52:04.120148 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:52:04 crc kubenswrapper[4717]: I0308 05:52:04.747413 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549152-l8zn2" Mar 08 05:52:04 crc kubenswrapper[4717]: I0308 05:52:04.846671 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhnn2\" (UniqueName: \"kubernetes.io/projected/431d1972-afa5-46c3-97f4-2de7cd1a578a-kube-api-access-nhnn2\") pod \"431d1972-afa5-46c3-97f4-2de7cd1a578a\" (UID: \"431d1972-afa5-46c3-97f4-2de7cd1a578a\") " Mar 08 05:52:04 crc kubenswrapper[4717]: I0308 05:52:04.875964 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/431d1972-afa5-46c3-97f4-2de7cd1a578a-kube-api-access-nhnn2" (OuterVolumeSpecName: "kube-api-access-nhnn2") pod "431d1972-afa5-46c3-97f4-2de7cd1a578a" (UID: "431d1972-afa5-46c3-97f4-2de7cd1a578a"). InnerVolumeSpecName "kube-api-access-nhnn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:52:04 crc kubenswrapper[4717]: I0308 05:52:04.950412 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhnn2\" (UniqueName: \"kubernetes.io/projected/431d1972-afa5-46c3-97f4-2de7cd1a578a-kube-api-access-nhnn2\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:05 crc kubenswrapper[4717]: I0308 05:52:05.311223 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549152-l8zn2" event={"ID":"431d1972-afa5-46c3-97f4-2de7cd1a578a","Type":"ContainerDied","Data":"686fca797d0ab08b6dcff15bb2a66bca5c2bf7cb919cb3ea81f8d2715a476c66"} Mar 08 05:52:05 crc kubenswrapper[4717]: I0308 05:52:05.311281 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549152-l8zn2" Mar 08 05:52:05 crc kubenswrapper[4717]: I0308 05:52:05.311297 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="686fca797d0ab08b6dcff15bb2a66bca5c2bf7cb919cb3ea81f8d2715a476c66" Mar 08 05:52:05 crc kubenswrapper[4717]: I0308 05:52:05.367456 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549146-t9dcp"] Mar 08 05:52:05 crc kubenswrapper[4717]: I0308 05:52:05.376900 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549146-t9dcp"] Mar 08 05:52:05 crc kubenswrapper[4717]: I0308 05:52:05.797612 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c20fb9c8-328d-494c-b578-4abc028448bd" path="/var/lib/kubelet/pods/c20fb9c8-328d-494c-b578-4abc028448bd/volumes" Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.325952 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9f5648895-t45xw" Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.435045 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546cc8c9-nzj7h"] Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.435327 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" podUID="45fd46fc-ae39-4ce2-8489-1da3e3178f2f" containerName="dnsmasq-dns" containerID="cri-o://a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88" gracePeriod=10 Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.926217 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.981399 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-sb\") pod \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.981463 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-nb\") pod \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.981528 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-swift-storage-0\") pod \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.981553 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgwjp\" (UniqueName: \"kubernetes.io/projected/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-kube-api-access-wgwjp\") pod \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.981586 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-config\") pod \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.981652 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-svc\") pod \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " Mar 08 05:52:08 crc kubenswrapper[4717]: I0308 05:52:08.981677 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-openstack-edpm-ipam\") pod \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\" (UID: \"45fd46fc-ae39-4ce2-8489-1da3e3178f2f\") " Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.011702 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-kube-api-access-wgwjp" (OuterVolumeSpecName: "kube-api-access-wgwjp") pod "45fd46fc-ae39-4ce2-8489-1da3e3178f2f" (UID: "45fd46fc-ae39-4ce2-8489-1da3e3178f2f"). InnerVolumeSpecName "kube-api-access-wgwjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.045620 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-config" (OuterVolumeSpecName: "config") pod "45fd46fc-ae39-4ce2-8489-1da3e3178f2f" (UID: "45fd46fc-ae39-4ce2-8489-1da3e3178f2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.063374 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45fd46fc-ae39-4ce2-8489-1da3e3178f2f" (UID: "45fd46fc-ae39-4ce2-8489-1da3e3178f2f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.063550 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45fd46fc-ae39-4ce2-8489-1da3e3178f2f" (UID: "45fd46fc-ae39-4ce2-8489-1da3e3178f2f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.065196 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45fd46fc-ae39-4ce2-8489-1da3e3178f2f" (UID: "45fd46fc-ae39-4ce2-8489-1da3e3178f2f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.071981 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "45fd46fc-ae39-4ce2-8489-1da3e3178f2f" (UID: "45fd46fc-ae39-4ce2-8489-1da3e3178f2f"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.084162 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgwjp\" (UniqueName: \"kubernetes.io/projected/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-kube-api-access-wgwjp\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.084199 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-config\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.084214 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.084226 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.084239 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.084250 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.098974 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45fd46fc-ae39-4ce2-8489-1da3e3178f2f" (UID: 
"45fd46fc-ae39-4ce2-8489-1da3e3178f2f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.186098 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45fd46fc-ae39-4ce2-8489-1da3e3178f2f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.366107 4717 generic.go:334] "Generic (PLEG): container finished" podID="45fd46fc-ae39-4ce2-8489-1da3e3178f2f" containerID="a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88" exitCode=0 Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.366146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" event={"ID":"45fd46fc-ae39-4ce2-8489-1da3e3178f2f","Type":"ContainerDied","Data":"a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88"} Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.366170 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" event={"ID":"45fd46fc-ae39-4ce2-8489-1da3e3178f2f","Type":"ContainerDied","Data":"528c293fa1fc295ac6ca4f5d4fc4153c8d6aafa42990c39f4a32eca3b6c335bd"} Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.366184 4717 scope.go:117] "RemoveContainer" containerID="a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.366292 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546cc8c9-nzj7h" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.403108 4717 scope.go:117] "RemoveContainer" containerID="a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.406004 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546cc8c9-nzj7h"] Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.417567 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-546cc8c9-nzj7h"] Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.421231 4717 scope.go:117] "RemoveContainer" containerID="a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88" Mar 08 05:52:09 crc kubenswrapper[4717]: E0308 05:52:09.421756 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88\": container with ID starting with a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88 not found: ID does not exist" containerID="a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.421813 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88"} err="failed to get container status \"a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88\": rpc error: code = NotFound desc = could not find container \"a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88\": container with ID starting with a42ed615766cfe4d478eb030b5ea4f85d589187b1cc4302e84879e984d038c88 not found: ID does not exist" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.421838 4717 scope.go:117] "RemoveContainer" containerID="a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2" Mar 08 
05:52:09 crc kubenswrapper[4717]: E0308 05:52:09.422221 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2\": container with ID starting with a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2 not found: ID does not exist" containerID="a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.422252 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2"} err="failed to get container status \"a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2\": rpc error: code = NotFound desc = could not find container \"a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2\": container with ID starting with a6ba1e74c413bbaa254e17c83253fce4c81bf38073dd59c55da4b75489dcdfc2 not found: ID does not exist" Mar 08 05:52:09 crc kubenswrapper[4717]: I0308 05:52:09.800201 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45fd46fc-ae39-4ce2-8489-1da3e3178f2f" path="/var/lib/kubelet/pods/45fd46fc-ae39-4ce2-8489-1da3e3178f2f/volumes" Mar 08 05:52:16 crc kubenswrapper[4717]: I0308 05:52:16.450506 4717 generic.go:334] "Generic (PLEG): container finished" podID="c1331e99-d131-4f8c-ae4e-6217cf54ddaf" containerID="84b2ed6169558aac6b0fd1d92039e0f28181e491d33a147160d69ab424e88b10" exitCode=0 Mar 08 05:52:16 crc kubenswrapper[4717]: I0308 05:52:16.450617 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c1331e99-d131-4f8c-ae4e-6217cf54ddaf","Type":"ContainerDied","Data":"84b2ed6169558aac6b0fd1d92039e0f28181e491d33a147160d69ab424e88b10"} Mar 08 05:52:16 crc kubenswrapper[4717]: I0308 05:52:16.454984 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7" containerID="3a0caf0c1ba783f6a3d4b40d25ddc69a2138a7db7ce8238543a8ab5927d88a44" exitCode=0 Mar 08 05:52:16 crc kubenswrapper[4717]: I0308 05:52:16.455055 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7","Type":"ContainerDied","Data":"3a0caf0c1ba783f6a3d4b40d25ddc69a2138a7db7ce8238543a8ab5927d88a44"} Mar 08 05:52:17 crc kubenswrapper[4717]: I0308 05:52:17.469025 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c1331e99-d131-4f8c-ae4e-6217cf54ddaf","Type":"ContainerStarted","Data":"51cb425eb280b93b4afc85afed2ed0a68a0e91a0ef0a7ca02a66cb1e35c4d499"} Mar 08 05:52:17 crc kubenswrapper[4717]: I0308 05:52:17.470290 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 08 05:52:17 crc kubenswrapper[4717]: I0308 05:52:17.472641 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7","Type":"ContainerStarted","Data":"435a3c184862c524fa9ae1987e3ce1a105127eec3bf15366c1350250678e2ff7"} Mar 08 05:52:17 crc kubenswrapper[4717]: I0308 05:52:17.472933 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:52:17 crc kubenswrapper[4717]: I0308 05:52:17.503330 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.503311487 podStartE2EDuration="37.503311487s" podCreationTimestamp="2026-03-08 05:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:52:17.492816568 +0000 UTC m=+1564.410465432" watchObservedRunningTime="2026-03-08 05:52:17.503311487 +0000 UTC m=+1564.420960331" Mar 08 05:52:21 crc kubenswrapper[4717]: 
I0308 05:52:21.111338 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.111315392 podStartE2EDuration="41.111315392s" podCreationTimestamp="2026-03-08 05:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 05:52:17.523084686 +0000 UTC m=+1564.440733540" watchObservedRunningTime="2026-03-08 05:52:21.111315392 +0000 UTC m=+1568.028964236" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.122337 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h"] Mar 08 05:52:21 crc kubenswrapper[4717]: E0308 05:52:21.122843 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fd46fc-ae39-4ce2-8489-1da3e3178f2f" containerName="init" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.122864 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fd46fc-ae39-4ce2-8489-1da3e3178f2f" containerName="init" Mar 08 05:52:21 crc kubenswrapper[4717]: E0308 05:52:21.122896 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="431d1972-afa5-46c3-97f4-2de7cd1a578a" containerName="oc" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.122905 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="431d1972-afa5-46c3-97f4-2de7cd1a578a" containerName="oc" Mar 08 05:52:21 crc kubenswrapper[4717]: E0308 05:52:21.122935 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fd46fc-ae39-4ce2-8489-1da3e3178f2f" containerName="dnsmasq-dns" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.122943 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fd46fc-ae39-4ce2-8489-1da3e3178f2f" containerName="dnsmasq-dns" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.123197 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="431d1972-afa5-46c3-97f4-2de7cd1a578a" containerName="oc" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.123231 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fd46fc-ae39-4ce2-8489-1da3e3178f2f" containerName="dnsmasq-dns" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.124176 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.130630 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.132032 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.132175 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.133392 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.138446 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h"] Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.249052 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr4pp\" (UniqueName: \"kubernetes.io/projected/9cff8432-fc11-45e4-9e59-9abfdc356b44-kube-api-access-nr4pp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.249494 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.249806 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.249870 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.351265 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr4pp\" (UniqueName: \"kubernetes.io/projected/9cff8432-fc11-45e4-9e59-9abfdc356b44-kube-api-access-nr4pp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.351321 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.351427 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.351448 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.357121 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.357412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.358215 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.370568 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr4pp\" (UniqueName: \"kubernetes.io/projected/9cff8432-fc11-45e4-9e59-9abfdc356b44-kube-api-access-nr4pp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:21 crc kubenswrapper[4717]: I0308 05:52:21.447705 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:22 crc kubenswrapper[4717]: I0308 05:52:22.153813 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h"] Mar 08 05:52:22 crc kubenswrapper[4717]: I0308 05:52:22.518847 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" event={"ID":"9cff8432-fc11-45e4-9e59-9abfdc356b44","Type":"ContainerStarted","Data":"5f933d67a1b10eccbbe35fbd15d623a86d1ba373e26c0dc320d7e612023cacac"} Mar 08 05:52:26 crc kubenswrapper[4717]: I0308 05:52:26.603766 4717 scope.go:117] "RemoveContainer" containerID="46a3f6ec15a16c4fee0133afccc02980a0d3834b1cb31a4698198b560a3df85b" Mar 08 05:52:30 crc kubenswrapper[4717]: I0308 05:52:30.381259 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c1331e99-d131-4f8c-ae4e-6217cf54ddaf" containerName="rabbitmq" probeResult="failure" 
output="dial tcp 10.217.0.241:5671: connect: connection refused" Mar 08 05:52:30 crc kubenswrapper[4717]: I0308 05:52:30.602827 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.242:5671: connect: connection refused" Mar 08 05:52:33 crc kubenswrapper[4717]: I0308 05:52:33.664043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" event={"ID":"9cff8432-fc11-45e4-9e59-9abfdc356b44","Type":"ContainerStarted","Data":"5d3e240c94190fc427d6fbb3c979cb2df5e357d8f1fc5fd193cee3648e3025c9"} Mar 08 05:52:33 crc kubenswrapper[4717]: I0308 05:52:33.700487 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" podStartSLOduration=2.305126279 podStartE2EDuration="12.700460891s" podCreationTimestamp="2026-03-08 05:52:21 +0000 UTC" firstStartedPulling="2026-03-08 05:52:22.13919864 +0000 UTC m=+1569.056847534" lastFinishedPulling="2026-03-08 05:52:32.534533252 +0000 UTC m=+1579.452182146" observedRunningTime="2026-03-08 05:52:33.681479412 +0000 UTC m=+1580.599128286" watchObservedRunningTime="2026-03-08 05:52:33.700460891 +0000 UTC m=+1580.618109765" Mar 08 05:52:34 crc kubenswrapper[4717]: I0308 05:52:34.121098 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:52:34 crc kubenswrapper[4717]: I0308 05:52:34.121167 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 05:52:40 crc kubenswrapper[4717]: I0308 05:52:40.377933 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 08 05:52:40 crc kubenswrapper[4717]: I0308 05:52:40.601851 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 08 05:52:43 crc kubenswrapper[4717]: I0308 05:52:43.781216 4717 generic.go:334] "Generic (PLEG): container finished" podID="9cff8432-fc11-45e4-9e59-9abfdc356b44" containerID="5d3e240c94190fc427d6fbb3c979cb2df5e357d8f1fc5fd193cee3648e3025c9" exitCode=0 Mar 08 05:52:43 crc kubenswrapper[4717]: I0308 05:52:43.799634 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" event={"ID":"9cff8432-fc11-45e4-9e59-9abfdc356b44","Type":"ContainerDied","Data":"5d3e240c94190fc427d6fbb3c979cb2df5e357d8f1fc5fd193cee3648e3025c9"} Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.334383 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.405421 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-repo-setup-combined-ca-bundle\") pod \"9cff8432-fc11-45e4-9e59-9abfdc356b44\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.405483 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-inventory\") pod \"9cff8432-fc11-45e4-9e59-9abfdc356b44\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.405522 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr4pp\" (UniqueName: \"kubernetes.io/projected/9cff8432-fc11-45e4-9e59-9abfdc356b44-kube-api-access-nr4pp\") pod \"9cff8432-fc11-45e4-9e59-9abfdc356b44\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.405642 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-ssh-key-openstack-edpm-ipam\") pod \"9cff8432-fc11-45e4-9e59-9abfdc356b44\" (UID: \"9cff8432-fc11-45e4-9e59-9abfdc356b44\") " Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.542242 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cff8432-fc11-45e4-9e59-9abfdc356b44-kube-api-access-nr4pp" (OuterVolumeSpecName: "kube-api-access-nr4pp") pod "9cff8432-fc11-45e4-9e59-9abfdc356b44" (UID: "9cff8432-fc11-45e4-9e59-9abfdc356b44"). InnerVolumeSpecName "kube-api-access-nr4pp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.542285 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9cff8432-fc11-45e4-9e59-9abfdc356b44" (UID: "9cff8432-fc11-45e4-9e59-9abfdc356b44"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.550608 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-inventory" (OuterVolumeSpecName: "inventory") pod "9cff8432-fc11-45e4-9e59-9abfdc356b44" (UID: "9cff8432-fc11-45e4-9e59-9abfdc356b44"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.557820 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9cff8432-fc11-45e4-9e59-9abfdc356b44" (UID: "9cff8432-fc11-45e4-9e59-9abfdc356b44"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.610721 4717 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.610755 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.610766 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr4pp\" (UniqueName: \"kubernetes.io/projected/9cff8432-fc11-45e4-9e59-9abfdc356b44-kube-api-access-nr4pp\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.610777 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cff8432-fc11-45e4-9e59-9abfdc356b44-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.811018 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" event={"ID":"9cff8432-fc11-45e4-9e59-9abfdc356b44","Type":"ContainerDied","Data":"5f933d67a1b10eccbbe35fbd15d623a86d1ba373e26c0dc320d7e612023cacac"} Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.811078 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f933d67a1b10eccbbe35fbd15d623a86d1ba373e26c0dc320d7e612023cacac" Mar 08 05:52:45 crc kubenswrapper[4717]: I0308 05:52:45.811106 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.037117 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2"] Mar 08 05:52:46 crc kubenswrapper[4717]: E0308 05:52:46.037884 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cff8432-fc11-45e4-9e59-9abfdc356b44" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.037904 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cff8432-fc11-45e4-9e59-9abfdc356b44" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.038140 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cff8432-fc11-45e4-9e59-9abfdc356b44" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.039041 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.040662 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.041359 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.041674 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.041901 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.052204 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2"] Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.120531 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqcr2\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.120814 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqcr2\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.120920 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqp24\" (UniqueName: \"kubernetes.io/projected/9af81be1-1bd6-46d1-ab21-d61cd769fd21-kube-api-access-jqp24\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqcr2\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.223264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqcr2\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.223403 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqcr2\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.223452 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqp24\" (UniqueName: \"kubernetes.io/projected/9af81be1-1bd6-46d1-ab21-d61cd769fd21-kube-api-access-jqp24\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqcr2\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.228465 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqcr2\" (UID: 
\"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.229150 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqcr2\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.253940 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqp24\" (UniqueName: \"kubernetes.io/projected/9af81be1-1bd6-46d1-ab21-d61cd769fd21-kube-api-access-jqp24\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kqcr2\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.395710 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.412049 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jzg8n"] Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.417872 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.448351 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jzg8n"] Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.531124 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-utilities\") pod \"community-operators-jzg8n\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.531425 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbv47\" (UniqueName: \"kubernetes.io/projected/d4a8a719-1cf0-49dd-906d-503052f97a9f-kube-api-access-vbv47\") pod \"community-operators-jzg8n\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.531469 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-catalog-content\") pod \"community-operators-jzg8n\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.633127 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-utilities\") pod \"community-operators-jzg8n\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.633179 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vbv47\" (UniqueName: \"kubernetes.io/projected/d4a8a719-1cf0-49dd-906d-503052f97a9f-kube-api-access-vbv47\") pod \"community-operators-jzg8n\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.633208 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-catalog-content\") pod \"community-operators-jzg8n\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.634267 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-catalog-content\") pod \"community-operators-jzg8n\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.634373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-utilities\") pod \"community-operators-jzg8n\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.654206 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbv47\" (UniqueName: \"kubernetes.io/projected/d4a8a719-1cf0-49dd-906d-503052f97a9f-kube-api-access-vbv47\") pod \"community-operators-jzg8n\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:46 crc kubenswrapper[4717]: I0308 05:52:46.827868 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:47 crc kubenswrapper[4717]: I0308 05:52:47.018359 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2"] Mar 08 05:52:47 crc kubenswrapper[4717]: I0308 05:52:47.280633 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jzg8n"] Mar 08 05:52:47 crc kubenswrapper[4717]: W0308 05:52:47.285996 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a8a719_1cf0_49dd_906d_503052f97a9f.slice/crio-32d52826066a286bc798c5915074acddf8272528bd57683d9cd4cdf75f949c17 WatchSource:0}: Error finding container 32d52826066a286bc798c5915074acddf8272528bd57683d9cd4cdf75f949c17: Status 404 returned error can't find the container with id 32d52826066a286bc798c5915074acddf8272528bd57683d9cd4cdf75f949c17 Mar 08 05:52:47 crc kubenswrapper[4717]: I0308 05:52:47.841200 4717 generic.go:334] "Generic (PLEG): container finished" podID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerID="e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78" exitCode=0 Mar 08 05:52:47 crc kubenswrapper[4717]: I0308 05:52:47.841312 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzg8n" event={"ID":"d4a8a719-1cf0-49dd-906d-503052f97a9f","Type":"ContainerDied","Data":"e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78"} Mar 08 05:52:47 crc kubenswrapper[4717]: I0308 05:52:47.841375 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzg8n" event={"ID":"d4a8a719-1cf0-49dd-906d-503052f97a9f","Type":"ContainerStarted","Data":"32d52826066a286bc798c5915074acddf8272528bd57683d9cd4cdf75f949c17"} Mar 08 05:52:47 crc kubenswrapper[4717]: I0308 05:52:47.844488 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" event={"ID":"9af81be1-1bd6-46d1-ab21-d61cd769fd21","Type":"ContainerStarted","Data":"53bd564645a2e6710cb5587ae6c009ae1e524bb41b899b517cf98c19bf28ed48"} Mar 08 05:52:48 crc kubenswrapper[4717]: I0308 05:52:48.863532 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzg8n" event={"ID":"d4a8a719-1cf0-49dd-906d-503052f97a9f","Type":"ContainerStarted","Data":"ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20"} Mar 08 05:52:48 crc kubenswrapper[4717]: I0308 05:52:48.866988 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" event={"ID":"9af81be1-1bd6-46d1-ab21-d61cd769fd21","Type":"ContainerStarted","Data":"bf9cc9aa8ec7e5f2a850e08502e2d9274769c7a4353bd5e5e4ad61346019d375"} Mar 08 05:52:48 crc kubenswrapper[4717]: I0308 05:52:48.921844 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" podStartSLOduration=2.180322694 podStartE2EDuration="2.921815575s" podCreationTimestamp="2026-03-08 05:52:46 +0000 UTC" firstStartedPulling="2026-03-08 05:52:47.012353272 +0000 UTC m=+1593.930002156" lastFinishedPulling="2026-03-08 05:52:47.753846163 +0000 UTC m=+1594.671495037" observedRunningTime="2026-03-08 05:52:48.914666878 +0000 UTC m=+1595.832315722" watchObservedRunningTime="2026-03-08 05:52:48.921815575 +0000 UTC m=+1595.839464419" Mar 08 05:52:49 crc kubenswrapper[4717]: I0308 05:52:49.887591 4717 generic.go:334] "Generic (PLEG): container finished" podID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerID="ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20" exitCode=0 Mar 08 05:52:49 crc kubenswrapper[4717]: I0308 05:52:49.887752 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzg8n" 
event={"ID":"d4a8a719-1cf0-49dd-906d-503052f97a9f","Type":"ContainerDied","Data":"ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20"} Mar 08 05:52:50 crc kubenswrapper[4717]: I0308 05:52:50.898597 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzg8n" event={"ID":"d4a8a719-1cf0-49dd-906d-503052f97a9f","Type":"ContainerStarted","Data":"4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93"} Mar 08 05:52:50 crc kubenswrapper[4717]: I0308 05:52:50.900058 4717 generic.go:334] "Generic (PLEG): container finished" podID="9af81be1-1bd6-46d1-ab21-d61cd769fd21" containerID="bf9cc9aa8ec7e5f2a850e08502e2d9274769c7a4353bd5e5e4ad61346019d375" exitCode=0 Mar 08 05:52:50 crc kubenswrapper[4717]: I0308 05:52:50.900104 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" event={"ID":"9af81be1-1bd6-46d1-ab21-d61cd769fd21","Type":"ContainerDied","Data":"bf9cc9aa8ec7e5f2a850e08502e2d9274769c7a4353bd5e5e4ad61346019d375"} Mar 08 05:52:50 crc kubenswrapper[4717]: I0308 05:52:50.921147 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jzg8n" podStartSLOduration=2.333050075 podStartE2EDuration="4.921128827s" podCreationTimestamp="2026-03-08 05:52:46 +0000 UTC" firstStartedPulling="2026-03-08 05:52:47.845886438 +0000 UTC m=+1594.763535282" lastFinishedPulling="2026-03-08 05:52:50.43396515 +0000 UTC m=+1597.351614034" observedRunningTime="2026-03-08 05:52:50.914973395 +0000 UTC m=+1597.832622239" watchObservedRunningTime="2026-03-08 05:52:50.921128827 +0000 UTC m=+1597.838777671" Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.491161 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.583898 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-inventory\") pod \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.584038 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-ssh-key-openstack-edpm-ipam\") pod \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.584158 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqp24\" (UniqueName: \"kubernetes.io/projected/9af81be1-1bd6-46d1-ab21-d61cd769fd21-kube-api-access-jqp24\") pod \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\" (UID: \"9af81be1-1bd6-46d1-ab21-d61cd769fd21\") " Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.591722 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af81be1-1bd6-46d1-ab21-d61cd769fd21-kube-api-access-jqp24" (OuterVolumeSpecName: "kube-api-access-jqp24") pod "9af81be1-1bd6-46d1-ab21-d61cd769fd21" (UID: "9af81be1-1bd6-46d1-ab21-d61cd769fd21"). InnerVolumeSpecName "kube-api-access-jqp24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.618026 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9af81be1-1bd6-46d1-ab21-d61cd769fd21" (UID: "9af81be1-1bd6-46d1-ab21-d61cd769fd21"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.620571 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-inventory" (OuterVolumeSpecName: "inventory") pod "9af81be1-1bd6-46d1-ab21-d61cd769fd21" (UID: "9af81be1-1bd6-46d1-ab21-d61cd769fd21"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.687429 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.687483 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af81be1-1bd6-46d1-ab21-d61cd769fd21-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.687502 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqp24\" (UniqueName: \"kubernetes.io/projected/9af81be1-1bd6-46d1-ab21-d61cd769fd21-kube-api-access-jqp24\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.927529 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" 
event={"ID":"9af81be1-1bd6-46d1-ab21-d61cd769fd21","Type":"ContainerDied","Data":"53bd564645a2e6710cb5587ae6c009ae1e524bb41b899b517cf98c19bf28ed48"} Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.927573 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kqcr2" Mar 08 05:52:52 crc kubenswrapper[4717]: I0308 05:52:52.927585 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53bd564645a2e6710cb5587ae6c009ae1e524bb41b899b517cf98c19bf28ed48" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.032257 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz"] Mar 08 05:52:53 crc kubenswrapper[4717]: E0308 05:52:53.032636 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af81be1-1bd6-46d1-ab21-d61cd769fd21" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.032653 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af81be1-1bd6-46d1-ab21-d61cd769fd21" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.032906 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af81be1-1bd6-46d1-ab21-d61cd769fd21" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.033666 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.036678 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.037245 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.037723 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.040124 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.066890 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz"] Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.196588 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.196661 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: 
I0308 05:52:53.196772 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.196952 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nvgl\" (UniqueName: \"kubernetes.io/projected/d8682143-56c7-442e-987a-d9da77fbe879-kube-api-access-4nvgl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.299252 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.299331 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.299377 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.299414 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nvgl\" (UniqueName: \"kubernetes.io/projected/d8682143-56c7-442e-987a-d9da77fbe879-kube-api-access-4nvgl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.303423 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.305168 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.312527 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.317985 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nvgl\" (UniqueName: \"kubernetes.io/projected/d8682143-56c7-442e-987a-d9da77fbe879-kube-api-access-4nvgl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.361568 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:52:53 crc kubenswrapper[4717]: I0308 05:52:53.947409 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz"] Mar 08 05:52:54 crc kubenswrapper[4717]: I0308 05:52:54.964750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" event={"ID":"d8682143-56c7-442e-987a-d9da77fbe879","Type":"ContainerStarted","Data":"23e9c0e166f12aaf13e078249d655a5019d7073d863e35519a9fb2caadd4277e"} Mar 08 05:52:55 crc kubenswrapper[4717]: I0308 05:52:55.976785 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" event={"ID":"d8682143-56c7-442e-987a-d9da77fbe879","Type":"ContainerStarted","Data":"8b8eac8a51c9e32ada85d8fe5db3c7e04fe057777d5dd0bc0d3c1b28ce67812f"} Mar 08 05:52:56 crc kubenswrapper[4717]: I0308 05:52:56.018867 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" podStartSLOduration=2.58285132 podStartE2EDuration="3.018839413s" podCreationTimestamp="2026-03-08 05:52:53 +0000 UTC" firstStartedPulling="2026-03-08 05:52:53.957970979 +0000 UTC m=+1600.875619823" 
lastFinishedPulling="2026-03-08 05:52:54.393959022 +0000 UTC m=+1601.311607916" observedRunningTime="2026-03-08 05:52:55.999445154 +0000 UTC m=+1602.917094048" watchObservedRunningTime="2026-03-08 05:52:56.018839413 +0000 UTC m=+1602.936488297" Mar 08 05:52:56 crc kubenswrapper[4717]: I0308 05:52:56.832842 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:56 crc kubenswrapper[4717]: I0308 05:52:56.833191 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:56 crc kubenswrapper[4717]: I0308 05:52:56.881422 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:57 crc kubenswrapper[4717]: I0308 05:52:57.055944 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:57 crc kubenswrapper[4717]: I0308 05:52:57.121486 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jzg8n"] Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.015510 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jzg8n" podUID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerName="registry-server" containerID="cri-o://4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93" gracePeriod=2 Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.512525 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.640852 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbv47\" (UniqueName: \"kubernetes.io/projected/d4a8a719-1cf0-49dd-906d-503052f97a9f-kube-api-access-vbv47\") pod \"d4a8a719-1cf0-49dd-906d-503052f97a9f\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.640969 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-catalog-content\") pod \"d4a8a719-1cf0-49dd-906d-503052f97a9f\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.641246 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-utilities\") pod \"d4a8a719-1cf0-49dd-906d-503052f97a9f\" (UID: \"d4a8a719-1cf0-49dd-906d-503052f97a9f\") " Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.642823 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-utilities" (OuterVolumeSpecName: "utilities") pod "d4a8a719-1cf0-49dd-906d-503052f97a9f" (UID: "d4a8a719-1cf0-49dd-906d-503052f97a9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.647938 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a8a719-1cf0-49dd-906d-503052f97a9f-kube-api-access-vbv47" (OuterVolumeSpecName: "kube-api-access-vbv47") pod "d4a8a719-1cf0-49dd-906d-503052f97a9f" (UID: "d4a8a719-1cf0-49dd-906d-503052f97a9f"). InnerVolumeSpecName "kube-api-access-vbv47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.719269 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4a8a719-1cf0-49dd-906d-503052f97a9f" (UID: "d4a8a719-1cf0-49dd-906d-503052f97a9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.743524 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.743566 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbv47\" (UniqueName: \"kubernetes.io/projected/d4a8a719-1cf0-49dd-906d-503052f97a9f-kube-api-access-vbv47\") on node \"crc\" DevicePath \"\"" Mar 08 05:52:59 crc kubenswrapper[4717]: I0308 05:52:59.743581 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a8a719-1cf0-49dd-906d-503052f97a9f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.033295 4717 generic.go:334] "Generic (PLEG): container finished" podID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerID="4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93" exitCode=0 Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.033364 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzg8n" event={"ID":"d4a8a719-1cf0-49dd-906d-503052f97a9f","Type":"ContainerDied","Data":"4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93"} Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.033390 4717 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzg8n" Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.033414 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzg8n" event={"ID":"d4a8a719-1cf0-49dd-906d-503052f97a9f","Type":"ContainerDied","Data":"32d52826066a286bc798c5915074acddf8272528bd57683d9cd4cdf75f949c17"} Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.033453 4717 scope.go:117] "RemoveContainer" containerID="4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93" Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.079916 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jzg8n"] Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.089007 4717 scope.go:117] "RemoveContainer" containerID="ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20" Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.090060 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jzg8n"] Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.133998 4717 scope.go:117] "RemoveContainer" containerID="e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78" Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.185179 4717 scope.go:117] "RemoveContainer" containerID="4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93" Mar 08 05:53:00 crc kubenswrapper[4717]: E0308 05:53:00.186153 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93\": container with ID starting with 4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93 not found: ID does not exist" containerID="4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93" Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.186210 
4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93"} err="failed to get container status \"4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93\": rpc error: code = NotFound desc = could not find container \"4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93\": container with ID starting with 4f72624fec09d90ba2a79f8844ce644c6472520f501522173dd0c8c2a46aea93 not found: ID does not exist" Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.186260 4717 scope.go:117] "RemoveContainer" containerID="ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20" Mar 08 05:53:00 crc kubenswrapper[4717]: E0308 05:53:00.186785 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20\": container with ID starting with ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20 not found: ID does not exist" containerID="ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20" Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.186829 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20"} err="failed to get container status \"ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20\": rpc error: code = NotFound desc = could not find container \"ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20\": container with ID starting with ff76363cbb61168aaecee6cb2fd1a8a93b8bfd9d42a1627c34ceee7f4f206f20 not found: ID does not exist" Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.186873 4717 scope.go:117] "RemoveContainer" containerID="e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78" Mar 08 05:53:00 crc kubenswrapper[4717]: E0308 
05:53:00.187318 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78\": container with ID starting with e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78 not found: ID does not exist" containerID="e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78" Mar 08 05:53:00 crc kubenswrapper[4717]: I0308 05:53:00.187356 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78"} err="failed to get container status \"e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78\": rpc error: code = NotFound desc = could not find container \"e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78\": container with ID starting with e6b7fa848b428eb618b3abb539e6218547fcdb80e3393eda3f79cfca1bc16f78 not found: ID does not exist" Mar 08 05:53:01 crc kubenswrapper[4717]: I0308 05:53:01.803319 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a8a719-1cf0-49dd-906d-503052f97a9f" path="/var/lib/kubelet/pods/d4a8a719-1cf0-49dd-906d-503052f97a9f/volumes" Mar 08 05:53:04 crc kubenswrapper[4717]: I0308 05:53:04.120381 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 05:53:04 crc kubenswrapper[4717]: I0308 05:53:04.120904 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 08 05:53:04 crc kubenswrapper[4717]: I0308 05:53:04.120964 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 05:53:04 crc kubenswrapper[4717]: I0308 05:53:04.121886 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 05:53:04 crc kubenswrapper[4717]: I0308 05:53:04.121955 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" gracePeriod=600 Mar 08 05:53:04 crc kubenswrapper[4717]: E0308 05:53:04.251650 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:53:05 crc kubenswrapper[4717]: I0308 05:53:05.127390 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" exitCode=0 Mar 08 05:53:05 crc kubenswrapper[4717]: I0308 05:53:05.127499 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b"} Mar 08 05:53:05 crc kubenswrapper[4717]: I0308 05:53:05.127866 4717 scope.go:117] "RemoveContainer" containerID="fdc38828b70d25a0ccd54dcdac75ca0eebf8f58cb86023b869d6450eb8241d7e" Mar 08 05:53:05 crc kubenswrapper[4717]: I0308 05:53:05.128741 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:53:05 crc kubenswrapper[4717]: E0308 05:53:05.129310 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:53:18 crc kubenswrapper[4717]: I0308 05:53:18.781540 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:53:18 crc kubenswrapper[4717]: E0308 05:53:18.782836 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:53:30 crc kubenswrapper[4717]: I0308 05:53:30.782352 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:53:30 crc kubenswrapper[4717]: E0308 05:53:30.783321 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:53:32 crc kubenswrapper[4717]: I0308 05:53:32.605044 4717 scope.go:117] "RemoveContainer" containerID="b6b97e7aa1da0b0cd74f5b9174ef3ad4c02d3e79f5545e8f1424a57c60a08ac9" Mar 08 05:53:32 crc kubenswrapper[4717]: I0308 05:53:32.638413 4717 scope.go:117] "RemoveContainer" containerID="f8f39680a411ecd5f389c31818c196277ab7a372cccccbd8c096d4ced38e9d4f" Mar 08 05:53:32 crc kubenswrapper[4717]: I0308 05:53:32.716304 4717 scope.go:117] "RemoveContainer" containerID="9c3a930dcf3788ab3fffa14759e926c172c7fce0734dcf7a0de30f8e282b88d1" Mar 08 05:53:44 crc kubenswrapper[4717]: I0308 05:53:44.787037 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:53:44 crc kubenswrapper[4717]: E0308 05:53:44.788916 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:53:58 crc kubenswrapper[4717]: I0308 05:53:58.782315 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:53:58 crc kubenswrapper[4717]: E0308 05:53:58.783839 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.157969 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549154-5zhm4"] Mar 08 05:54:00 crc kubenswrapper[4717]: E0308 05:54:00.158426 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerName="registry-server" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.158442 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerName="registry-server" Mar 08 05:54:00 crc kubenswrapper[4717]: E0308 05:54:00.158470 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerName="extract-content" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.158478 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerName="extract-content" Mar 08 05:54:00 crc kubenswrapper[4717]: E0308 05:54:00.158503 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerName="extract-utilities" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.158511 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerName="extract-utilities" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.158903 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a8a719-1cf0-49dd-906d-503052f97a9f" containerName="registry-server" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.159867 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549154-5zhm4" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.164049 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.164088 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.164132 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.178165 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549154-5zhm4"] Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.265807 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lh77\" (UniqueName: \"kubernetes.io/projected/447edb16-b654-4e0c-8a6a-88b08b829b9a-kube-api-access-6lh77\") pod \"auto-csr-approver-29549154-5zhm4\" (UID: \"447edb16-b654-4e0c-8a6a-88b08b829b9a\") " pod="openshift-infra/auto-csr-approver-29549154-5zhm4" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.370249 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lh77\" (UniqueName: \"kubernetes.io/projected/447edb16-b654-4e0c-8a6a-88b08b829b9a-kube-api-access-6lh77\") pod \"auto-csr-approver-29549154-5zhm4\" (UID: \"447edb16-b654-4e0c-8a6a-88b08b829b9a\") " pod="openshift-infra/auto-csr-approver-29549154-5zhm4" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.391152 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lh77\" (UniqueName: \"kubernetes.io/projected/447edb16-b654-4e0c-8a6a-88b08b829b9a-kube-api-access-6lh77\") pod \"auto-csr-approver-29549154-5zhm4\" (UID: \"447edb16-b654-4e0c-8a6a-88b08b829b9a\") " 
pod="openshift-infra/auto-csr-approver-29549154-5zhm4" Mar 08 05:54:00 crc kubenswrapper[4717]: I0308 05:54:00.516841 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549154-5zhm4" Mar 08 05:54:01 crc kubenswrapper[4717]: I0308 05:54:01.045589 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549154-5zhm4"] Mar 08 05:54:01 crc kubenswrapper[4717]: I0308 05:54:01.406130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549154-5zhm4" event={"ID":"447edb16-b654-4e0c-8a6a-88b08b829b9a","Type":"ContainerStarted","Data":"1a3c123763afd9d6145fa1a2111b2cf671585bbbc51535b292bf1afebbc493d2"} Mar 08 05:54:02 crc kubenswrapper[4717]: I0308 05:54:02.417374 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549154-5zhm4" event={"ID":"447edb16-b654-4e0c-8a6a-88b08b829b9a","Type":"ContainerStarted","Data":"1d9cd15406bfd90dcaf8552ffc44b8b4eeb86577a42d5015e284cee5025389c1"} Mar 08 05:54:02 crc kubenswrapper[4717]: I0308 05:54:02.450931 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549154-5zhm4" podStartSLOduration=1.4619226269999999 podStartE2EDuration="2.450910475s" podCreationTimestamp="2026-03-08 05:54:00 +0000 UTC" firstStartedPulling="2026-03-08 05:54:01.04358248 +0000 UTC m=+1667.961231324" lastFinishedPulling="2026-03-08 05:54:02.032570288 +0000 UTC m=+1668.950219172" observedRunningTime="2026-03-08 05:54:02.433446603 +0000 UTC m=+1669.351095477" watchObservedRunningTime="2026-03-08 05:54:02.450910475 +0000 UTC m=+1669.368559339" Mar 08 05:54:03 crc kubenswrapper[4717]: I0308 05:54:03.430780 4717 generic.go:334] "Generic (PLEG): container finished" podID="447edb16-b654-4e0c-8a6a-88b08b829b9a" containerID="1d9cd15406bfd90dcaf8552ffc44b8b4eeb86577a42d5015e284cee5025389c1" exitCode=0 Mar 08 05:54:03 crc 
kubenswrapper[4717]: I0308 05:54:03.430828 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549154-5zhm4" event={"ID":"447edb16-b654-4e0c-8a6a-88b08b829b9a","Type":"ContainerDied","Data":"1d9cd15406bfd90dcaf8552ffc44b8b4eeb86577a42d5015e284cee5025389c1"} Mar 08 05:54:04 crc kubenswrapper[4717]: I0308 05:54:04.854678 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549154-5zhm4" Mar 08 05:54:04 crc kubenswrapper[4717]: I0308 05:54:04.974126 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lh77\" (UniqueName: \"kubernetes.io/projected/447edb16-b654-4e0c-8a6a-88b08b829b9a-kube-api-access-6lh77\") pod \"447edb16-b654-4e0c-8a6a-88b08b829b9a\" (UID: \"447edb16-b654-4e0c-8a6a-88b08b829b9a\") " Mar 08 05:54:04 crc kubenswrapper[4717]: I0308 05:54:04.979882 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/447edb16-b654-4e0c-8a6a-88b08b829b9a-kube-api-access-6lh77" (OuterVolumeSpecName: "kube-api-access-6lh77") pod "447edb16-b654-4e0c-8a6a-88b08b829b9a" (UID: "447edb16-b654-4e0c-8a6a-88b08b829b9a"). InnerVolumeSpecName "kube-api-access-6lh77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:54:05 crc kubenswrapper[4717]: I0308 05:54:05.076947 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lh77\" (UniqueName: \"kubernetes.io/projected/447edb16-b654-4e0c-8a6a-88b08b829b9a-kube-api-access-6lh77\") on node \"crc\" DevicePath \"\"" Mar 08 05:54:05 crc kubenswrapper[4717]: I0308 05:54:05.456249 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549154-5zhm4" event={"ID":"447edb16-b654-4e0c-8a6a-88b08b829b9a","Type":"ContainerDied","Data":"1a3c123763afd9d6145fa1a2111b2cf671585bbbc51535b292bf1afebbc493d2"} Mar 08 05:54:05 crc kubenswrapper[4717]: I0308 05:54:05.456292 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a3c123763afd9d6145fa1a2111b2cf671585bbbc51535b292bf1afebbc493d2" Mar 08 05:54:05 crc kubenswrapper[4717]: I0308 05:54:05.456395 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549154-5zhm4" Mar 08 05:54:05 crc kubenswrapper[4717]: I0308 05:54:05.555071 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549148-nrst9"] Mar 08 05:54:05 crc kubenswrapper[4717]: I0308 05:54:05.566108 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549148-nrst9"] Mar 08 05:54:05 crc kubenswrapper[4717]: I0308 05:54:05.803936 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a9f655-96fd-4f95-bbd3-5bbf8db0faa5" path="/var/lib/kubelet/pods/90a9f655-96fd-4f95-bbd3-5bbf8db0faa5/volumes" Mar 08 05:54:10 crc kubenswrapper[4717]: I0308 05:54:10.781528 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:54:10 crc kubenswrapper[4717]: E0308 05:54:10.782157 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:54:22 crc kubenswrapper[4717]: I0308 05:54:22.782766 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:54:22 crc kubenswrapper[4717]: E0308 05:54:22.783976 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:54:32 crc kubenswrapper[4717]: I0308 05:54:32.889129 4717 scope.go:117] "RemoveContainer" containerID="20143731b1089e4e5df5f39c5474d218cc338bc7f42a3ac3401f869d4dfd0375" Mar 08 05:54:32 crc kubenswrapper[4717]: I0308 05:54:32.935652 4717 scope.go:117] "RemoveContainer" containerID="027e1cd56a577cf8e15a05dc5f68938811b1fb8b11346d437ee869cfbd1124af" Mar 08 05:54:32 crc kubenswrapper[4717]: I0308 05:54:32.977519 4717 scope.go:117] "RemoveContainer" containerID="56d6adbe1a47bc739f60662d8c5fcff494f54d3c7e7935ce9d54cdbdb0582939" Mar 08 05:54:33 crc kubenswrapper[4717]: I0308 05:54:33.010263 4717 scope.go:117] "RemoveContainer" containerID="14cd17fcca283abbc714961f8eb0a5659f169538f6fdbff2f48cbae56f3280d9" Mar 08 05:54:33 crc kubenswrapper[4717]: I0308 05:54:33.082326 4717 scope.go:117] "RemoveContainer" containerID="c66da218ef4dcfd150c269a841cdaf78fa977264bfea9a42b75c0f73aebc2824" Mar 08 05:54:33 crc kubenswrapper[4717]: I0308 05:54:33.116778 4717 
scope.go:117] "RemoveContainer" containerID="9d026ad3aa94009a1b17e448af4616c4bfb226b1a1ccc50684106646ea69f8c6" Mar 08 05:54:33 crc kubenswrapper[4717]: I0308 05:54:33.179533 4717 scope.go:117] "RemoveContainer" containerID="51a17d9acbc55b1122dcd14627309e3fda8cdc98435b465ccbc287efb5d05b49" Mar 08 05:54:37 crc kubenswrapper[4717]: I0308 05:54:37.781978 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:54:37 crc kubenswrapper[4717]: E0308 05:54:37.782873 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:54:52 crc kubenswrapper[4717]: I0308 05:54:52.782779 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:54:52 crc kubenswrapper[4717]: E0308 05:54:52.784037 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.805756 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rcz2z"] Mar 08 05:54:56 crc kubenswrapper[4717]: E0308 05:54:56.808339 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="447edb16-b654-4e0c-8a6a-88b08b829b9a" containerName="oc" Mar 
08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.808499 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="447edb16-b654-4e0c-8a6a-88b08b829b9a" containerName="oc" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.808971 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="447edb16-b654-4e0c-8a6a-88b08b829b9a" containerName="oc" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.811473 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.858520 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcz2z"] Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.866399 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnpwb\" (UniqueName: \"kubernetes.io/projected/72c01da6-6934-44b4-b9d5-1668557d9f8e-kube-api-access-xnpwb\") pod \"certified-operators-rcz2z\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.866457 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-catalog-content\") pod \"certified-operators-rcz2z\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.866542 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-utilities\") pod \"certified-operators-rcz2z\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:56 crc 
kubenswrapper[4717]: I0308 05:54:56.972443 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnpwb\" (UniqueName: \"kubernetes.io/projected/72c01da6-6934-44b4-b9d5-1668557d9f8e-kube-api-access-xnpwb\") pod \"certified-operators-rcz2z\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.972596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-catalog-content\") pod \"certified-operators-rcz2z\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.972879 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-utilities\") pod \"certified-operators-rcz2z\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.974405 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-catalog-content\") pod \"certified-operators-rcz2z\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.974772 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-utilities\") pod \"certified-operators-rcz2z\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:56 crc kubenswrapper[4717]: I0308 05:54:56.994615 
4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnpwb\" (UniqueName: \"kubernetes.io/projected/72c01da6-6934-44b4-b9d5-1668557d9f8e-kube-api-access-xnpwb\") pod \"certified-operators-rcz2z\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:57 crc kubenswrapper[4717]: I0308 05:54:57.162924 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:54:57 crc kubenswrapper[4717]: I0308 05:54:57.642562 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcz2z"] Mar 08 05:54:58 crc kubenswrapper[4717]: I0308 05:54:58.185229 4717 generic.go:334] "Generic (PLEG): container finished" podID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerID="56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8" exitCode=0 Mar 08 05:54:58 crc kubenswrapper[4717]: I0308 05:54:58.185281 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcz2z" event={"ID":"72c01da6-6934-44b4-b9d5-1668557d9f8e","Type":"ContainerDied","Data":"56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8"} Mar 08 05:54:58 crc kubenswrapper[4717]: I0308 05:54:58.185310 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcz2z" event={"ID":"72c01da6-6934-44b4-b9d5-1668557d9f8e","Type":"ContainerStarted","Data":"ebee16692d69671f84c65fd903b69aeb659ad763fa4e6db286e6078f9993ce31"} Mar 08 05:54:59 crc kubenswrapper[4717]: I0308 05:54:59.197272 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcz2z" event={"ID":"72c01da6-6934-44b4-b9d5-1668557d9f8e","Type":"ContainerStarted","Data":"af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162"} Mar 08 05:55:01 crc kubenswrapper[4717]: I0308 05:55:01.225753 4717 
generic.go:334] "Generic (PLEG): container finished" podID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerID="af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162" exitCode=0 Mar 08 05:55:01 crc kubenswrapper[4717]: I0308 05:55:01.225848 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcz2z" event={"ID":"72c01da6-6934-44b4-b9d5-1668557d9f8e","Type":"ContainerDied","Data":"af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162"} Mar 08 05:55:02 crc kubenswrapper[4717]: I0308 05:55:02.238034 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcz2z" event={"ID":"72c01da6-6934-44b4-b9d5-1668557d9f8e","Type":"ContainerStarted","Data":"0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818"} Mar 08 05:55:02 crc kubenswrapper[4717]: I0308 05:55:02.268902 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rcz2z" podStartSLOduration=2.824699666 podStartE2EDuration="6.268880434s" podCreationTimestamp="2026-03-08 05:54:56 +0000 UTC" firstStartedPulling="2026-03-08 05:54:58.188016826 +0000 UTC m=+1725.105665670" lastFinishedPulling="2026-03-08 05:55:01.632197594 +0000 UTC m=+1728.549846438" observedRunningTime="2026-03-08 05:55:02.25938681 +0000 UTC m=+1729.177035684" watchObservedRunningTime="2026-03-08 05:55:02.268880434 +0000 UTC m=+1729.186529288" Mar 08 05:55:04 crc kubenswrapper[4717]: I0308 05:55:04.782137 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:55:04 crc kubenswrapper[4717]: E0308 05:55:04.782695 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:55:07 crc kubenswrapper[4717]: I0308 05:55:07.164032 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:55:07 crc kubenswrapper[4717]: I0308 05:55:07.164370 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:55:07 crc kubenswrapper[4717]: I0308 05:55:07.246738 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:55:07 crc kubenswrapper[4717]: I0308 05:55:07.338880 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:55:07 crc kubenswrapper[4717]: I0308 05:55:07.491700 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcz2z"] Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.315362 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rcz2z" podUID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerName="registry-server" containerID="cri-o://0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818" gracePeriod=2 Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.817516 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.865877 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-catalog-content\") pod \"72c01da6-6934-44b4-b9d5-1668557d9f8e\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.866113 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-utilities\") pod \"72c01da6-6934-44b4-b9d5-1668557d9f8e\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.866142 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnpwb\" (UniqueName: \"kubernetes.io/projected/72c01da6-6934-44b4-b9d5-1668557d9f8e-kube-api-access-xnpwb\") pod \"72c01da6-6934-44b4-b9d5-1668557d9f8e\" (UID: \"72c01da6-6934-44b4-b9d5-1668557d9f8e\") " Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.869299 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-utilities" (OuterVolumeSpecName: "utilities") pod "72c01da6-6934-44b4-b9d5-1668557d9f8e" (UID: "72c01da6-6934-44b4-b9d5-1668557d9f8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.884431 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c01da6-6934-44b4-b9d5-1668557d9f8e-kube-api-access-xnpwb" (OuterVolumeSpecName: "kube-api-access-xnpwb") pod "72c01da6-6934-44b4-b9d5-1668557d9f8e" (UID: "72c01da6-6934-44b4-b9d5-1668557d9f8e"). InnerVolumeSpecName "kube-api-access-xnpwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.944025 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72c01da6-6934-44b4-b9d5-1668557d9f8e" (UID: "72c01da6-6934-44b4-b9d5-1668557d9f8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.968980 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.969023 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnpwb\" (UniqueName: \"kubernetes.io/projected/72c01da6-6934-44b4-b9d5-1668557d9f8e-kube-api-access-xnpwb\") on node \"crc\" DevicePath \"\"" Mar 08 05:55:09 crc kubenswrapper[4717]: I0308 05:55:09.969037 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72c01da6-6934-44b4-b9d5-1668557d9f8e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.329202 4717 generic.go:334] "Generic (PLEG): container finished" podID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerID="0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818" exitCode=0 Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.329275 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcz2z" event={"ID":"72c01da6-6934-44b4-b9d5-1668557d9f8e","Type":"ContainerDied","Data":"0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818"} Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.329321 4717 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-rcz2z" event={"ID":"72c01da6-6934-44b4-b9d5-1668557d9f8e","Type":"ContainerDied","Data":"ebee16692d69671f84c65fd903b69aeb659ad763fa4e6db286e6078f9993ce31"} Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.329372 4717 scope.go:117] "RemoveContainer" containerID="0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818" Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.329711 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcz2z" Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.379178 4717 scope.go:117] "RemoveContainer" containerID="af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162" Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.389897 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcz2z"] Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.414739 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rcz2z"] Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.418028 4717 scope.go:117] "RemoveContainer" containerID="56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8" Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.485063 4717 scope.go:117] "RemoveContainer" containerID="0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818" Mar 08 05:55:10 crc kubenswrapper[4717]: E0308 05:55:10.485633 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818\": container with ID starting with 0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818 not found: ID does not exist" containerID="0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818" Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 
05:55:10.485705 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818"} err="failed to get container status \"0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818\": rpc error: code = NotFound desc = could not find container \"0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818\": container with ID starting with 0a4ef3839e14b79ca8fa8f83ed6a53cc54e2af2ca5cadd5d4787451644965818 not found: ID does not exist" Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.485766 4717 scope.go:117] "RemoveContainer" containerID="af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162" Mar 08 05:55:10 crc kubenswrapper[4717]: E0308 05:55:10.486155 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162\": container with ID starting with af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162 not found: ID does not exist" containerID="af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162" Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.486208 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162"} err="failed to get container status \"af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162\": rpc error: code = NotFound desc = could not find container \"af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162\": container with ID starting with af0068235e20294b6b6dbd977b3ab208a419a9035eef0c8335c96923bf8c5162 not found: ID does not exist" Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.486241 4717 scope.go:117] "RemoveContainer" containerID="56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8" Mar 08 05:55:10 crc 
kubenswrapper[4717]: E0308 05:55:10.486865 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8\": container with ID starting with 56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8 not found: ID does not exist" containerID="56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8" Mar 08 05:55:10 crc kubenswrapper[4717]: I0308 05:55:10.486920 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8"} err="failed to get container status \"56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8\": rpc error: code = NotFound desc = could not find container \"56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8\": container with ID starting with 56759c6adb86e03a32c70e8e2d892fe99be5c3001a8442704cdad404b522c2b8 not found: ID does not exist" Mar 08 05:55:11 crc kubenswrapper[4717]: I0308 05:55:11.800371 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c01da6-6934-44b4-b9d5-1668557d9f8e" path="/var/lib/kubelet/pods/72c01da6-6934-44b4-b9d5-1668557d9f8e/volumes" Mar 08 05:55:18 crc kubenswrapper[4717]: I0308 05:55:18.783532 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:55:18 crc kubenswrapper[4717]: E0308 05:55:18.784845 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:55:31 crc 
kubenswrapper[4717]: I0308 05:55:31.782716 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:55:31 crc kubenswrapper[4717]: E0308 05:55:31.783981 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:55:33 crc kubenswrapper[4717]: I0308 05:55:33.343280 4717 scope.go:117] "RemoveContainer" containerID="61ce5c0deb4b32b6586eb00f56b8e91449469492b9cba140c467eadd3a213697" Mar 08 05:55:43 crc kubenswrapper[4717]: I0308 05:55:43.795147 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:55:43 crc kubenswrapper[4717]: E0308 05:55:43.796071 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:55:56 crc kubenswrapper[4717]: I0308 05:55:56.782246 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:55:56 crc kubenswrapper[4717]: E0308 05:55:56.783178 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.150206 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549156-xd45c"] Mar 08 05:56:00 crc kubenswrapper[4717]: E0308 05:56:00.151294 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerName="extract-content" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.151323 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerName="extract-content" Mar 08 05:56:00 crc kubenswrapper[4717]: E0308 05:56:00.151379 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerName="registry-server" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.151392 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerName="registry-server" Mar 08 05:56:00 crc kubenswrapper[4717]: E0308 05:56:00.151423 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerName="extract-utilities" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.151446 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerName="extract-utilities" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.151916 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="72c01da6-6934-44b4-b9d5-1668557d9f8e" containerName="registry-server" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.153327 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549156-xd45c" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.155548 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.155602 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.157071 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.168507 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549156-xd45c"] Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.270924 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6b5v\" (UniqueName: \"kubernetes.io/projected/36a115a6-c45f-4247-baa4-97af8465ac5c-kube-api-access-p6b5v\") pod \"auto-csr-approver-29549156-xd45c\" (UID: \"36a115a6-c45f-4247-baa4-97af8465ac5c\") " pod="openshift-infra/auto-csr-approver-29549156-xd45c" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.372803 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6b5v\" (UniqueName: \"kubernetes.io/projected/36a115a6-c45f-4247-baa4-97af8465ac5c-kube-api-access-p6b5v\") pod \"auto-csr-approver-29549156-xd45c\" (UID: \"36a115a6-c45f-4247-baa4-97af8465ac5c\") " pod="openshift-infra/auto-csr-approver-29549156-xd45c" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.394315 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6b5v\" (UniqueName: \"kubernetes.io/projected/36a115a6-c45f-4247-baa4-97af8465ac5c-kube-api-access-p6b5v\") pod \"auto-csr-approver-29549156-xd45c\" (UID: \"36a115a6-c45f-4247-baa4-97af8465ac5c\") " 
pod="openshift-infra/auto-csr-approver-29549156-xd45c" Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.481639 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549156-xd45c" Mar 08 05:56:00 crc kubenswrapper[4717]: W0308 05:56:00.958598 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36a115a6_c45f_4247_baa4_97af8465ac5c.slice/crio-b0c16dab7dfe15bd3c360e58c5f36b73a6365ebd4ee9d88902856f188be76aef WatchSource:0}: Error finding container b0c16dab7dfe15bd3c360e58c5f36b73a6365ebd4ee9d88902856f188be76aef: Status 404 returned error can't find the container with id b0c16dab7dfe15bd3c360e58c5f36b73a6365ebd4ee9d88902856f188be76aef Mar 08 05:56:00 crc kubenswrapper[4717]: I0308 05:56:00.959245 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549156-xd45c"] Mar 08 05:56:01 crc kubenswrapper[4717]: I0308 05:56:01.979601 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549156-xd45c" event={"ID":"36a115a6-c45f-4247-baa4-97af8465ac5c","Type":"ContainerStarted","Data":"b0c16dab7dfe15bd3c360e58c5f36b73a6365ebd4ee9d88902856f188be76aef"} Mar 08 05:56:03 crc kubenswrapper[4717]: I0308 05:56:03.012583 4717 generic.go:334] "Generic (PLEG): container finished" podID="36a115a6-c45f-4247-baa4-97af8465ac5c" containerID="c3461660eaef6e37145da764f83cacc73e0f6235dba2451c2fd8bdb99819e97f" exitCode=0 Mar 08 05:56:03 crc kubenswrapper[4717]: I0308 05:56:03.012728 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549156-xd45c" event={"ID":"36a115a6-c45f-4247-baa4-97af8465ac5c","Type":"ContainerDied","Data":"c3461660eaef6e37145da764f83cacc73e0f6235dba2451c2fd8bdb99819e97f"} Mar 08 05:56:04 crc kubenswrapper[4717]: I0308 05:56:04.401675 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549156-xd45c" Mar 08 05:56:04 crc kubenswrapper[4717]: I0308 05:56:04.560609 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6b5v\" (UniqueName: \"kubernetes.io/projected/36a115a6-c45f-4247-baa4-97af8465ac5c-kube-api-access-p6b5v\") pod \"36a115a6-c45f-4247-baa4-97af8465ac5c\" (UID: \"36a115a6-c45f-4247-baa4-97af8465ac5c\") " Mar 08 05:56:04 crc kubenswrapper[4717]: I0308 05:56:04.575086 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a115a6-c45f-4247-baa4-97af8465ac5c-kube-api-access-p6b5v" (OuterVolumeSpecName: "kube-api-access-p6b5v") pod "36a115a6-c45f-4247-baa4-97af8465ac5c" (UID: "36a115a6-c45f-4247-baa4-97af8465ac5c"). InnerVolumeSpecName "kube-api-access-p6b5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:56:04 crc kubenswrapper[4717]: I0308 05:56:04.663737 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6b5v\" (UniqueName: \"kubernetes.io/projected/36a115a6-c45f-4247-baa4-97af8465ac5c-kube-api-access-p6b5v\") on node \"crc\" DevicePath \"\"" Mar 08 05:56:05 crc kubenswrapper[4717]: I0308 05:56:05.038031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549156-xd45c" event={"ID":"36a115a6-c45f-4247-baa4-97af8465ac5c","Type":"ContainerDied","Data":"b0c16dab7dfe15bd3c360e58c5f36b73a6365ebd4ee9d88902856f188be76aef"} Mar 08 05:56:05 crc kubenswrapper[4717]: I0308 05:56:05.038085 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0c16dab7dfe15bd3c360e58c5f36b73a6365ebd4ee9d88902856f188be76aef" Mar 08 05:56:05 crc kubenswrapper[4717]: I0308 05:56:05.038117 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549156-xd45c" Mar 08 05:56:05 crc kubenswrapper[4717]: I0308 05:56:05.489653 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549150-rddpb"] Mar 08 05:56:05 crc kubenswrapper[4717]: I0308 05:56:05.501937 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549150-rddpb"] Mar 08 05:56:05 crc kubenswrapper[4717]: I0308 05:56:05.797777 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7cd813-172e-475f-aef7-609e9932b290" path="/var/lib/kubelet/pods/cf7cd813-172e-475f-aef7-609e9932b290/volumes" Mar 08 05:56:10 crc kubenswrapper[4717]: I0308 05:56:10.102343 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8682143-56c7-442e-987a-d9da77fbe879" containerID="8b8eac8a51c9e32ada85d8fe5db3c7e04fe057777d5dd0bc0d3c1b28ce67812f" exitCode=0 Mar 08 05:56:10 crc kubenswrapper[4717]: I0308 05:56:10.102456 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" event={"ID":"d8682143-56c7-442e-987a-d9da77fbe879","Type":"ContainerDied","Data":"8b8eac8a51c9e32ada85d8fe5db3c7e04fe057777d5dd0bc0d3c1b28ce67812f"} Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.686223 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.718815 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nvgl\" (UniqueName: \"kubernetes.io/projected/d8682143-56c7-442e-987a-d9da77fbe879-kube-api-access-4nvgl\") pod \"d8682143-56c7-442e-987a-d9da77fbe879\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.718986 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-bootstrap-combined-ca-bundle\") pod \"d8682143-56c7-442e-987a-d9da77fbe879\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.719025 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-inventory\") pod \"d8682143-56c7-442e-987a-d9da77fbe879\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.719226 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-ssh-key-openstack-edpm-ipam\") pod \"d8682143-56c7-442e-987a-d9da77fbe879\" (UID: \"d8682143-56c7-442e-987a-d9da77fbe879\") " Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.725436 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d8682143-56c7-442e-987a-d9da77fbe879" (UID: "d8682143-56c7-442e-987a-d9da77fbe879"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.726876 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8682143-56c7-442e-987a-d9da77fbe879-kube-api-access-4nvgl" (OuterVolumeSpecName: "kube-api-access-4nvgl") pod "d8682143-56c7-442e-987a-d9da77fbe879" (UID: "d8682143-56c7-442e-987a-d9da77fbe879"). InnerVolumeSpecName "kube-api-access-4nvgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.751764 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d8682143-56c7-442e-987a-d9da77fbe879" (UID: "d8682143-56c7-442e-987a-d9da77fbe879"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.754165 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-inventory" (OuterVolumeSpecName: "inventory") pod "d8682143-56c7-442e-987a-d9da77fbe879" (UID: "d8682143-56c7-442e-987a-d9da77fbe879"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.782383 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:56:11 crc kubenswrapper[4717]: E0308 05:56:11.782883 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.822510 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nvgl\" (UniqueName: \"kubernetes.io/projected/d8682143-56c7-442e-987a-d9da77fbe879-kube-api-access-4nvgl\") on node \"crc\" DevicePath \"\"" Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.822537 4717 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.822547 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 05:56:11 crc kubenswrapper[4717]: I0308 05:56:11.822555 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8682143-56c7-442e-987a-d9da77fbe879-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.127901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" event={"ID":"d8682143-56c7-442e-987a-d9da77fbe879","Type":"ContainerDied","Data":"23e9c0e166f12aaf13e078249d655a5019d7073d863e35519a9fb2caadd4277e"} Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.128298 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23e9c0e166f12aaf13e078249d655a5019d7073d863e35519a9fb2caadd4277e" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.128022 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.245762 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c"] Mar 08 05:56:12 crc kubenswrapper[4717]: E0308 05:56:12.246258 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a115a6-c45f-4247-baa4-97af8465ac5c" containerName="oc" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.246284 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a115a6-c45f-4247-baa4-97af8465ac5c" containerName="oc" Mar 08 05:56:12 crc kubenswrapper[4717]: E0308 05:56:12.246338 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8682143-56c7-442e-987a-d9da77fbe879" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.246351 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8682143-56c7-442e-987a-d9da77fbe879" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.246662 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a115a6-c45f-4247-baa4-97af8465ac5c" containerName="oc" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.246726 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d8682143-56c7-442e-987a-d9da77fbe879" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.247626 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.250526 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.251027 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.251344 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.252167 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.281814 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c"] Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.333843 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.333908 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5lj4\" (UniqueName: 
\"kubernetes.io/projected/099dec1f-b123-4da0-a81f-52ee1b27d5df-kube-api-access-w5lj4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.333935 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.436926 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.437054 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5lj4\" (UniqueName: \"kubernetes.io/projected/099dec1f-b123-4da0-a81f-52ee1b27d5df-kube-api-access-w5lj4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.437109 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c\" (UID: 
\"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.444249 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.444803 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.467197 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5lj4\" (UniqueName: \"kubernetes.io/projected/099dec1f-b123-4da0-a81f-52ee1b27d5df-kube-api-access-w5lj4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:12 crc kubenswrapper[4717]: I0308 05:56:12.575581 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:56:13 crc kubenswrapper[4717]: W0308 05:56:13.121416 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod099dec1f_b123_4da0_a81f_52ee1b27d5df.slice/crio-128e9d079f055113b4c3ef41e8d4fd9f7eef09d374a8a8bec9ea372e0158c672 WatchSource:0}: Error finding container 128e9d079f055113b4c3ef41e8d4fd9f7eef09d374a8a8bec9ea372e0158c672: Status 404 returned error can't find the container with id 128e9d079f055113b4c3ef41e8d4fd9f7eef09d374a8a8bec9ea372e0158c672 Mar 08 05:56:13 crc kubenswrapper[4717]: I0308 05:56:13.132449 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c"] Mar 08 05:56:13 crc kubenswrapper[4717]: I0308 05:56:13.145866 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" event={"ID":"099dec1f-b123-4da0-a81f-52ee1b27d5df","Type":"ContainerStarted","Data":"128e9d079f055113b4c3ef41e8d4fd9f7eef09d374a8a8bec9ea372e0158c672"} Mar 08 05:56:14 crc kubenswrapper[4717]: I0308 05:56:14.164305 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" event={"ID":"099dec1f-b123-4da0-a81f-52ee1b27d5df","Type":"ContainerStarted","Data":"930f0bda0aeea488bff4fc7e4a1585ad5e222400cfacfe1d3c7eba6235c69176"} Mar 08 05:56:14 crc kubenswrapper[4717]: I0308 05:56:14.189412 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" podStartSLOduration=1.6781639080000001 podStartE2EDuration="2.189393434s" podCreationTimestamp="2026-03-08 05:56:12 +0000 UTC" firstStartedPulling="2026-03-08 05:56:13.12412384 +0000 UTC m=+1800.041772684" lastFinishedPulling="2026-03-08 05:56:13.635353356 +0000 UTC 
m=+1800.553002210" observedRunningTime="2026-03-08 05:56:14.18472839 +0000 UTC m=+1801.102377274" watchObservedRunningTime="2026-03-08 05:56:14.189393434 +0000 UTC m=+1801.107042288" Mar 08 05:56:25 crc kubenswrapper[4717]: I0308 05:56:25.782588 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:56:25 crc kubenswrapper[4717]: E0308 05:56:25.783814 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:56:33 crc kubenswrapper[4717]: I0308 05:56:33.436234 4717 scope.go:117] "RemoveContainer" containerID="64e508a9372bb09e04233f401018389b5d43f61eda7d881c7626436d11a2d294" Mar 08 05:56:36 crc kubenswrapper[4717]: I0308 05:56:36.781838 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:56:36 crc kubenswrapper[4717]: E0308 05:56:36.782837 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:56:50 crc kubenswrapper[4717]: I0308 05:56:50.782590 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:56:50 crc kubenswrapper[4717]: E0308 05:56:50.783771 4717 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:56:53 crc kubenswrapper[4717]: I0308 05:56:53.064289 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-z92m6"] Mar 08 05:56:53 crc kubenswrapper[4717]: I0308 05:56:53.086329 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-z92m6"] Mar 08 05:56:53 crc kubenswrapper[4717]: I0308 05:56:53.806486 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23475f99-47f9-4533-9a3d-c8a024f6bfdb" path="/var/lib/kubelet/pods/23475f99-47f9-4533-9a3d-c8a024f6bfdb/volumes" Mar 08 05:56:54 crc kubenswrapper[4717]: I0308 05:56:54.053075 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-efc6-account-create-update-79h54"] Mar 08 05:56:54 crc kubenswrapper[4717]: I0308 05:56:54.064287 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-cc34-account-create-update-kcgjs"] Mar 08 05:56:54 crc kubenswrapper[4717]: I0308 05:56:54.072789 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-efc6-account-create-update-79h54"] Mar 08 05:56:54 crc kubenswrapper[4717]: I0308 05:56:54.083931 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-cc34-account-create-update-kcgjs"] Mar 08 05:56:55 crc kubenswrapper[4717]: I0308 05:56:55.796806 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22ef4c0a-9284-45bb-9319-21b75e4e1327" path="/var/lib/kubelet/pods/22ef4c0a-9284-45bb-9319-21b75e4e1327/volumes" Mar 08 05:56:55 crc kubenswrapper[4717]: I0308 05:56:55.799065 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e74691e-98a9-4c29-9c1f-8ce4e5788f35" path="/var/lib/kubelet/pods/9e74691e-98a9-4c29-9c1f-8ce4e5788f35/volumes" Mar 08 05:56:56 crc kubenswrapper[4717]: I0308 05:56:56.045673 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-4wvld"] Mar 08 05:56:56 crc kubenswrapper[4717]: I0308 05:56:56.065526 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-4wvld"] Mar 08 05:56:57 crc kubenswrapper[4717]: I0308 05:56:57.038969 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-ee16-account-create-update-nz4f6"] Mar 08 05:56:57 crc kubenswrapper[4717]: I0308 05:56:57.053254 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-ee16-account-create-update-nz4f6"] Mar 08 05:56:57 crc kubenswrapper[4717]: I0308 05:56:57.801439 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe0279d-2b57-4bf8-bf59-57801e535420" path="/var/lib/kubelet/pods/bfe0279d-2b57-4bf8-bf59-57801e535420/volumes" Mar 08 05:56:57 crc kubenswrapper[4717]: I0308 05:56:57.803317 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3322e2d-06fd-4728-b09c-e03982c5afc0" path="/var/lib/kubelet/pods/e3322e2d-06fd-4728-b09c-e03982c5afc0/volumes" Mar 08 05:57:02 crc kubenswrapper[4717]: I0308 05:57:02.782745 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:57:02 crc kubenswrapper[4717]: E0308 05:57:02.783542 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:57:03 crc kubenswrapper[4717]: I0308 05:57:03.052087 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-t4wh5"] Mar 08 05:57:03 crc kubenswrapper[4717]: I0308 05:57:03.063240 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9f40-account-create-update-bhbfw"] Mar 08 05:57:03 crc kubenswrapper[4717]: I0308 05:57:03.071810 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-t4wh5"] Mar 08 05:57:03 crc kubenswrapper[4717]: I0308 05:57:03.080254 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9f40-account-create-update-bhbfw"] Mar 08 05:57:03 crc kubenswrapper[4717]: I0308 05:57:03.805903 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a07c895-406b-426a-b60a-545e0dced812" path="/var/lib/kubelet/pods/2a07c895-406b-426a-b60a-545e0dced812/volumes" Mar 08 05:57:03 crc kubenswrapper[4717]: I0308 05:57:03.807373 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c576697c-a1be-47aa-bf8f-902435b2af04" path="/var/lib/kubelet/pods/c576697c-a1be-47aa-bf8f-902435b2af04/volumes" Mar 08 05:57:04 crc kubenswrapper[4717]: I0308 05:57:04.033196 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-smbl2"] Mar 08 05:57:04 crc kubenswrapper[4717]: I0308 05:57:04.042352 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-smbl2"] Mar 08 05:57:05 crc kubenswrapper[4717]: I0308 05:57:05.803795 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d77060-47d4-40ad-b1c1-5000a1513aaa" path="/var/lib/kubelet/pods/a6d77060-47d4-40ad-b1c1-5000a1513aaa/volumes" Mar 08 05:57:11 crc kubenswrapper[4717]: I0308 05:57:11.028613 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-t5kvr"] Mar 08 05:57:11 crc kubenswrapper[4717]: I0308 
05:57:11.038816 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-t5kvr"] Mar 08 05:57:11 crc kubenswrapper[4717]: I0308 05:57:11.811566 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3277c4-fd24-4629-8325-6abed267f270" path="/var/lib/kubelet/pods/bb3277c4-fd24-4629-8325-6abed267f270/volumes" Mar 08 05:57:14 crc kubenswrapper[4717]: I0308 05:57:14.783496 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:57:14 crc kubenswrapper[4717]: E0308 05:57:14.784743 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:57:29 crc kubenswrapper[4717]: I0308 05:57:29.052685 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xvl9h"] Mar 08 05:57:29 crc kubenswrapper[4717]: I0308 05:57:29.065228 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xvl9h"] Mar 08 05:57:29 crc kubenswrapper[4717]: I0308 05:57:29.781362 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:57:29 crc kubenswrapper[4717]: E0308 05:57:29.781720 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:57:29 crc kubenswrapper[4717]: I0308 05:57:29.797976 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8602ca4e-9467-4ad0-a725-c129f29bbedf" path="/var/lib/kubelet/pods/8602ca4e-9467-4ad0-a725-c129f29bbedf/volumes" Mar 08 05:57:30 crc kubenswrapper[4717]: I0308 05:57:30.074797 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-1e73-account-create-update-dmhcb"] Mar 08 05:57:30 crc kubenswrapper[4717]: I0308 05:57:30.103579 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wvdfv"] Mar 08 05:57:30 crc kubenswrapper[4717]: I0308 05:57:30.131079 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-1e73-account-create-update-dmhcb"] Mar 08 05:57:30 crc kubenswrapper[4717]: I0308 05:57:30.140888 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-217a-account-create-update-pplnn"] Mar 08 05:57:30 crc kubenswrapper[4717]: I0308 05:57:30.147242 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wvdfv"] Mar 08 05:57:30 crc kubenswrapper[4717]: I0308 05:57:30.155830 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-217a-account-create-update-pplnn"] Mar 08 05:57:31 crc kubenswrapper[4717]: I0308 05:57:31.800352 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="081ac052-0d8e-40f7-8618-7786fece62b3" path="/var/lib/kubelet/pods/081ac052-0d8e-40f7-8618-7786fece62b3/volumes" Mar 08 05:57:31 crc kubenswrapper[4717]: I0308 05:57:31.801672 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd06b81e-1f5a-4f94-9b81-b24889dc8154" path="/var/lib/kubelet/pods/bd06b81e-1f5a-4f94-9b81-b24889dc8154/volumes" Mar 08 05:57:31 crc kubenswrapper[4717]: I0308 05:57:31.802841 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c01a3977-851c-4679-b543-a7f550c8ec53" 
path="/var/lib/kubelet/pods/c01a3977-851c-4679-b543-a7f550c8ec53/volumes" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.058744 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-sr87b"] Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.072302 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a0cf-account-create-update-skd47"] Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.083393 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a0cf-account-create-update-skd47"] Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.093808 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-sr87b"] Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.547021 4717 scope.go:117] "RemoveContainer" containerID="88116863db703bb3b571378b6c7e55e0f340f27f38cfb2c370b14f9a8a90c895" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.577031 4717 scope.go:117] "RemoveContainer" containerID="088d3cbf4a68a9371e3d9737a7d7a0e8d8fa4df911be2a27196944665c99232d" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.638329 4717 scope.go:117] "RemoveContainer" containerID="b63971cf580c1772314687faac888cade14410bc46fd71b3900fadc9caf9252f" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.695873 4717 scope.go:117] "RemoveContainer" containerID="255cfec37cbcfaa183ff2967bc16799020f26ac8d36e31e90a93cf71423f99a0" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.727329 4717 scope.go:117] "RemoveContainer" containerID="a408b0cf1826d25536b15f2d3c86a71de3d0646fc1cee3d534dab3694024215a" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.777225 4717 scope.go:117] "RemoveContainer" containerID="008c39892611d17ac1115e163ddbdd097cf110a7d4cc4e20295f38f134bc360f" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.808129 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44abc9b7-3edc-42a0-bcf0-8b07efcec67f" 
path="/var/lib/kubelet/pods/44abc9b7-3edc-42a0-bcf0-8b07efcec67f/volumes" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.809098 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc72f330-6146-4017-adde-5a63a6cffdb4" path="/var/lib/kubelet/pods/cc72f330-6146-4017-adde-5a63a6cffdb4/volumes" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.838895 4717 scope.go:117] "RemoveContainer" containerID="88a5f3e61553288b62237bd90b5208a54c45283b46c95004d70bece082206615" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.859193 4717 scope.go:117] "RemoveContainer" containerID="1885c81c3e71bf433e55afd291b620b96f295a75aae4aa9b087adb55005bea90" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.875455 4717 scope.go:117] "RemoveContainer" containerID="1c1c7255770cb151222afda0c8d5fe610519c563259e929df10847e4916c8c39" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.893348 4717 scope.go:117] "RemoveContainer" containerID="083eec1c0f86dee19866dc6eaf00858722d74cab29a43e6d88f14412ea08ca92" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.924497 4717 scope.go:117] "RemoveContainer" containerID="d561e3606294bb7c359a0512789d0bc6239139bee9a05e2754a178f549f63624" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.948973 4717 scope.go:117] "RemoveContainer" containerID="38df43df38f0283dc78351ea63e00fded75b4291888b55bd10fe1cf656f105dc" Mar 08 05:57:33 crc kubenswrapper[4717]: I0308 05:57:33.973720 4717 scope.go:117] "RemoveContainer" containerID="ec91b22d592cf245308f4999c233f8e985fcb6a4481a7aa80b7ca073e6e9c605" Mar 08 05:57:34 crc kubenswrapper[4717]: I0308 05:57:34.020750 4717 scope.go:117] "RemoveContainer" containerID="c1c2fd3dc0c770759a543293e06d12b62d1bf3abd53b33e80ff7f1d9a482704b" Mar 08 05:57:34 crc kubenswrapper[4717]: I0308 05:57:34.027705 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-r5tmd"] Mar 08 05:57:34 crc kubenswrapper[4717]: I0308 05:57:34.036361 4717 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-r5tmd"] Mar 08 05:57:35 crc kubenswrapper[4717]: I0308 05:57:35.805258 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536f54f2-24ab-4b5d-a494-77d2464d03f9" path="/var/lib/kubelet/pods/536f54f2-24ab-4b5d-a494-77d2464d03f9/volumes" Mar 08 05:57:40 crc kubenswrapper[4717]: I0308 05:57:40.040610 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-h5xmg"] Mar 08 05:57:40 crc kubenswrapper[4717]: I0308 05:57:40.065836 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-h5xmg"] Mar 08 05:57:41 crc kubenswrapper[4717]: I0308 05:57:41.782725 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:57:41 crc kubenswrapper[4717]: E0308 05:57:41.782995 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:57:41 crc kubenswrapper[4717]: I0308 05:57:41.804938 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da7adfb-c194-4c69-af19-99e2ff00dbfa" path="/var/lib/kubelet/pods/9da7adfb-c194-4c69-af19-99e2ff00dbfa/volumes" Mar 08 05:57:47 crc kubenswrapper[4717]: I0308 05:57:47.038667 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-m97sm"] Mar 08 05:57:47 crc kubenswrapper[4717]: I0308 05:57:47.071676 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-m97sm"] Mar 08 05:57:47 crc kubenswrapper[4717]: I0308 05:57:47.801168 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="90593a29-3f8c-4228-8c82-a183a4e33054" path="/var/lib/kubelet/pods/90593a29-3f8c-4228-8c82-a183a4e33054/volumes" Mar 08 05:57:54 crc kubenswrapper[4717]: I0308 05:57:54.782604 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:57:54 crc kubenswrapper[4717]: E0308 05:57:54.784249 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.151205 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549158-c56wc"] Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.153442 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549158-c56wc" Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.156849 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.157350 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.158616 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.163037 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549158-c56wc"] Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.246431 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dt7t\" (UniqueName: \"kubernetes.io/projected/e990088a-bb32-4b49-90c2-f0307d160ae2-kube-api-access-5dt7t\") pod \"auto-csr-approver-29549158-c56wc\" (UID: \"e990088a-bb32-4b49-90c2-f0307d160ae2\") " pod="openshift-infra/auto-csr-approver-29549158-c56wc" Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.348212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dt7t\" (UniqueName: \"kubernetes.io/projected/e990088a-bb32-4b49-90c2-f0307d160ae2-kube-api-access-5dt7t\") pod \"auto-csr-approver-29549158-c56wc\" (UID: \"e990088a-bb32-4b49-90c2-f0307d160ae2\") " pod="openshift-infra/auto-csr-approver-29549158-c56wc" Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.368472 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dt7t\" (UniqueName: \"kubernetes.io/projected/e990088a-bb32-4b49-90c2-f0307d160ae2-kube-api-access-5dt7t\") pod \"auto-csr-approver-29549158-c56wc\" (UID: \"e990088a-bb32-4b49-90c2-f0307d160ae2\") " 
pod="openshift-infra/auto-csr-approver-29549158-c56wc" Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.504753 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549158-c56wc" Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.962093 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549158-c56wc"] Mar 08 05:58:00 crc kubenswrapper[4717]: I0308 05:58:00.967755 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 05:58:01 crc kubenswrapper[4717]: I0308 05:58:01.510038 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549158-c56wc" event={"ID":"e990088a-bb32-4b49-90c2-f0307d160ae2","Type":"ContainerStarted","Data":"268c33cb793c64890c81ad024246e19d01f33304dcdafe6145d792ed330ea8ca"} Mar 08 05:58:02 crc kubenswrapper[4717]: I0308 05:58:02.519056 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549158-c56wc" event={"ID":"e990088a-bb32-4b49-90c2-f0307d160ae2","Type":"ContainerStarted","Data":"89c1b502f4ec03c921ed3ff7f6d1f3e9fc534bdaf933b86f9e5d2db1a122ed0a"} Mar 08 05:58:02 crc kubenswrapper[4717]: I0308 05:58:02.537184 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549158-c56wc" podStartSLOduration=1.431961874 podStartE2EDuration="2.537162168s" podCreationTimestamp="2026-03-08 05:58:00 +0000 UTC" firstStartedPulling="2026-03-08 05:58:00.967544716 +0000 UTC m=+1907.885193550" lastFinishedPulling="2026-03-08 05:58:02.07274499 +0000 UTC m=+1908.990393844" observedRunningTime="2026-03-08 05:58:02.531363226 +0000 UTC m=+1909.449012090" watchObservedRunningTime="2026-03-08 05:58:02.537162168 +0000 UTC m=+1909.454811002" Mar 08 05:58:03 crc kubenswrapper[4717]: I0308 05:58:03.532162 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="e990088a-bb32-4b49-90c2-f0307d160ae2" containerID="89c1b502f4ec03c921ed3ff7f6d1f3e9fc534bdaf933b86f9e5d2db1a122ed0a" exitCode=0 Mar 08 05:58:03 crc kubenswrapper[4717]: I0308 05:58:03.532270 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549158-c56wc" event={"ID":"e990088a-bb32-4b49-90c2-f0307d160ae2","Type":"ContainerDied","Data":"89c1b502f4ec03c921ed3ff7f6d1f3e9fc534bdaf933b86f9e5d2db1a122ed0a"} Mar 08 05:58:04 crc kubenswrapper[4717]: I0308 05:58:04.904913 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549158-c56wc" Mar 08 05:58:05 crc kubenswrapper[4717]: I0308 05:58:05.037051 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dt7t\" (UniqueName: \"kubernetes.io/projected/e990088a-bb32-4b49-90c2-f0307d160ae2-kube-api-access-5dt7t\") pod \"e990088a-bb32-4b49-90c2-f0307d160ae2\" (UID: \"e990088a-bb32-4b49-90c2-f0307d160ae2\") " Mar 08 05:58:05 crc kubenswrapper[4717]: I0308 05:58:05.042875 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e990088a-bb32-4b49-90c2-f0307d160ae2-kube-api-access-5dt7t" (OuterVolumeSpecName: "kube-api-access-5dt7t") pod "e990088a-bb32-4b49-90c2-f0307d160ae2" (UID: "e990088a-bb32-4b49-90c2-f0307d160ae2"). InnerVolumeSpecName "kube-api-access-5dt7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:58:05 crc kubenswrapper[4717]: I0308 05:58:05.139909 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dt7t\" (UniqueName: \"kubernetes.io/projected/e990088a-bb32-4b49-90c2-f0307d160ae2-kube-api-access-5dt7t\") on node \"crc\" DevicePath \"\"" Mar 08 05:58:05 crc kubenswrapper[4717]: I0308 05:58:05.555390 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549158-c56wc" event={"ID":"e990088a-bb32-4b49-90c2-f0307d160ae2","Type":"ContainerDied","Data":"268c33cb793c64890c81ad024246e19d01f33304dcdafe6145d792ed330ea8ca"} Mar 08 05:58:05 crc kubenswrapper[4717]: I0308 05:58:05.555829 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="268c33cb793c64890c81ad024246e19d01f33304dcdafe6145d792ed330ea8ca" Mar 08 05:58:05 crc kubenswrapper[4717]: I0308 05:58:05.555451 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549158-c56wc" Mar 08 05:58:05 crc kubenswrapper[4717]: I0308 05:58:05.629583 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549152-l8zn2"] Mar 08 05:58:05 crc kubenswrapper[4717]: I0308 05:58:05.640563 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549152-l8zn2"] Mar 08 05:58:05 crc kubenswrapper[4717]: I0308 05:58:05.797855 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="431d1972-afa5-46c3-97f4-2de7cd1a578a" path="/var/lib/kubelet/pods/431d1972-afa5-46c3-97f4-2de7cd1a578a/volumes" Mar 08 05:58:06 crc kubenswrapper[4717]: I0308 05:58:06.782055 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 05:58:07 crc kubenswrapper[4717]: I0308 05:58:07.577248 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"836fb6ea7382ba7653ac2743e70af5d3bd32623b547700d430e91afed9d0c9da"} Mar 08 05:58:14 crc kubenswrapper[4717]: I0308 05:58:14.047837 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-htmd2"] Mar 08 05:58:14 crc kubenswrapper[4717]: I0308 05:58:14.058394 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-htmd2"] Mar 08 05:58:15 crc kubenswrapper[4717]: I0308 05:58:15.833017 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818627ad-f6eb-43d2-adfc-7daacc7f9b6f" path="/var/lib/kubelet/pods/818627ad-f6eb-43d2-adfc-7daacc7f9b6f/volumes" Mar 08 05:58:16 crc kubenswrapper[4717]: I0308 05:58:16.046480 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-wvxkm"] Mar 08 05:58:16 crc kubenswrapper[4717]: I0308 05:58:16.057885 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-wvxkm"] Mar 08 05:58:17 crc kubenswrapper[4717]: I0308 05:58:17.806370 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc84338-48ac-4538-b134-5993d5a9f91c" path="/var/lib/kubelet/pods/ffc84338-48ac-4538-b134-5993d5a9f91c/volumes" Mar 08 05:58:18 crc kubenswrapper[4717]: I0308 05:58:18.712372 4717 generic.go:334] "Generic (PLEG): container finished" podID="099dec1f-b123-4da0-a81f-52ee1b27d5df" containerID="930f0bda0aeea488bff4fc7e4a1585ad5e222400cfacfe1d3c7eba6235c69176" exitCode=0 Mar 08 05:58:18 crc kubenswrapper[4717]: I0308 05:58:18.712469 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" event={"ID":"099dec1f-b123-4da0-a81f-52ee1b27d5df","Type":"ContainerDied","Data":"930f0bda0aeea488bff4fc7e4a1585ad5e222400cfacfe1d3c7eba6235c69176"} Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 
05:58:20.305432 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.443111 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-inventory\") pod \"099dec1f-b123-4da0-a81f-52ee1b27d5df\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.443516 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5lj4\" (UniqueName: \"kubernetes.io/projected/099dec1f-b123-4da0-a81f-52ee1b27d5df-kube-api-access-w5lj4\") pod \"099dec1f-b123-4da0-a81f-52ee1b27d5df\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.443668 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-ssh-key-openstack-edpm-ipam\") pod \"099dec1f-b123-4da0-a81f-52ee1b27d5df\" (UID: \"099dec1f-b123-4da0-a81f-52ee1b27d5df\") " Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.458082 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/099dec1f-b123-4da0-a81f-52ee1b27d5df-kube-api-access-w5lj4" (OuterVolumeSpecName: "kube-api-access-w5lj4") pod "099dec1f-b123-4da0-a81f-52ee1b27d5df" (UID: "099dec1f-b123-4da0-a81f-52ee1b27d5df"). InnerVolumeSpecName "kube-api-access-w5lj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.499926 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-inventory" (OuterVolumeSpecName: "inventory") pod "099dec1f-b123-4da0-a81f-52ee1b27d5df" (UID: "099dec1f-b123-4da0-a81f-52ee1b27d5df"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.514112 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "099dec1f-b123-4da0-a81f-52ee1b27d5df" (UID: "099dec1f-b123-4da0-a81f-52ee1b27d5df"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.546025 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.546081 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5lj4\" (UniqueName: \"kubernetes.io/projected/099dec1f-b123-4da0-a81f-52ee1b27d5df-kube-api-access-w5lj4\") on node \"crc\" DevicePath \"\"" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.546105 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/099dec1f-b123-4da0-a81f-52ee1b27d5df-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.557499 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jpmb6"] Mar 08 05:58:20 crc kubenswrapper[4717]: E0308 
05:58:20.558453 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099dec1f-b123-4da0-a81f-52ee1b27d5df" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.558486 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="099dec1f-b123-4da0-a81f-52ee1b27d5df" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 08 05:58:20 crc kubenswrapper[4717]: E0308 05:58:20.558569 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e990088a-bb32-4b49-90c2-f0307d160ae2" containerName="oc" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.558588 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e990088a-bb32-4b49-90c2-f0307d160ae2" containerName="oc" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.559150 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e990088a-bb32-4b49-90c2-f0307d160ae2" containerName="oc" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.559207 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="099dec1f-b123-4da0-a81f-52ee1b27d5df" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.569061 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jpmb6" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.573304 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jpmb6"] Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.649131 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjvff\" (UniqueName: \"kubernetes.io/projected/16141190-3d2b-4f32-9249-75104b5afdfa-kube-api-access-cjvff\") pod \"redhat-marketplace-jpmb6\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") " pod="openshift-marketplace/redhat-marketplace-jpmb6" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.649399 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-catalog-content\") pod \"redhat-marketplace-jpmb6\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") " pod="openshift-marketplace/redhat-marketplace-jpmb6" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.649570 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-utilities\") pod \"redhat-marketplace-jpmb6\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") " pod="openshift-marketplace/redhat-marketplace-jpmb6" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.737629 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" event={"ID":"099dec1f-b123-4da0-a81f-52ee1b27d5df","Type":"ContainerDied","Data":"128e9d079f055113b4c3ef41e8d4fd9f7eef09d374a8a8bec9ea372e0158c672"} Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.737795 4717 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="128e9d079f055113b4c3ef41e8d4fd9f7eef09d374a8a8bec9ea372e0158c672" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.737708 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.750985 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-utilities\") pod \"redhat-marketplace-jpmb6\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") " pod="openshift-marketplace/redhat-marketplace-jpmb6" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.751106 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjvff\" (UniqueName: \"kubernetes.io/projected/16141190-3d2b-4f32-9249-75104b5afdfa-kube-api-access-cjvff\") pod \"redhat-marketplace-jpmb6\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") " pod="openshift-marketplace/redhat-marketplace-jpmb6" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.751196 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-catalog-content\") pod \"redhat-marketplace-jpmb6\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") " pod="openshift-marketplace/redhat-marketplace-jpmb6" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.751462 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-utilities\") pod \"redhat-marketplace-jpmb6\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") " pod="openshift-marketplace/redhat-marketplace-jpmb6" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.751595 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-catalog-content\") pod \"redhat-marketplace-jpmb6\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") " pod="openshift-marketplace/redhat-marketplace-jpmb6" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.767284 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjvff\" (UniqueName: \"kubernetes.io/projected/16141190-3d2b-4f32-9249-75104b5afdfa-kube-api-access-cjvff\") pod \"redhat-marketplace-jpmb6\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") " pod="openshift-marketplace/redhat-marketplace-jpmb6" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.861669 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"] Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.862960 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.865199 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.865641 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.865773 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.865895 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 05:58:20 crc kubenswrapper[4717]: I0308 05:58:20.869878 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"] Mar 08 05:58:20 crc 
kubenswrapper[4717]: I0308 05:58:20.898411 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jpmb6"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.055174 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cgw8\" (UniqueName: \"kubernetes.io/projected/fbffff61-9614-4594-b52e-be489d2b2f22-kube-api-access-4cgw8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.055622 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.055651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.158139 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.158195 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.158297 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cgw8\" (UniqueName: \"kubernetes.io/projected/fbffff61-9614-4594-b52e-be489d2b2f22-kube-api-access-4cgw8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.162922 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.162940 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.174363 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cgw8\" (UniqueName: \"kubernetes.io/projected/fbffff61-9614-4594-b52e-be489d2b2f22-kube-api-access-4cgw8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.186416 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.388850 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jpmb6"]
Mar 08 05:58:21 crc kubenswrapper[4717]: W0308 05:58:21.717736 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbffff61_9614_4594_b52e_be489d2b2f22.slice/crio-5fbdb0ae639b395a1e7ba4993e48f2fa36405a09887bd506abfcf99779f16fa9 WatchSource:0}: Error finding container 5fbdb0ae639b395a1e7ba4993e48f2fa36405a09887bd506abfcf99779f16fa9: Status 404 returned error can't find the container with id 5fbdb0ae639b395a1e7ba4993e48f2fa36405a09887bd506abfcf99779f16fa9
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.727803 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"]
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.751406 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm" event={"ID":"fbffff61-9614-4594-b52e-be489d2b2f22","Type":"ContainerStarted","Data":"5fbdb0ae639b395a1e7ba4993e48f2fa36405a09887bd506abfcf99779f16fa9"}
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.753789 4717 generic.go:334] "Generic (PLEG): container finished" podID="16141190-3d2b-4f32-9249-75104b5afdfa" containerID="3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b" exitCode=0
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.753851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpmb6" event={"ID":"16141190-3d2b-4f32-9249-75104b5afdfa","Type":"ContainerDied","Data":"3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b"}
Mar 08 05:58:21 crc kubenswrapper[4717]: I0308 05:58:21.753896 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpmb6" event={"ID":"16141190-3d2b-4f32-9249-75104b5afdfa","Type":"ContainerStarted","Data":"afccd328c186d1966dfaa0fcf2cb5d7d974724f48459be14f5ab6c4250f52586"}
Mar 08 05:58:22 crc kubenswrapper[4717]: I0308 05:58:22.770820 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm" event={"ID":"fbffff61-9614-4594-b52e-be489d2b2f22","Type":"ContainerStarted","Data":"e997382420839e29afde49fb758bd745c92b8769fa4d56f668d4b406a2861f97"}
Mar 08 05:58:22 crc kubenswrapper[4717]: I0308 05:58:22.800310 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm" podStartSLOduration=2.280718674 podStartE2EDuration="2.800288759s" podCreationTimestamp="2026-03-08 05:58:20 +0000 UTC" firstStartedPulling="2026-03-08 05:58:21.721451713 +0000 UTC m=+1928.639100607" lastFinishedPulling="2026-03-08 05:58:22.241021808 +0000 UTC m=+1929.158670692" observedRunningTime="2026-03-08 05:58:22.78812353 +0000 UTC m=+1929.705772384" watchObservedRunningTime="2026-03-08 05:58:22.800288759 +0000 UTC m=+1929.717937613"
Mar 08 05:58:23 crc kubenswrapper[4717]: I0308 05:58:23.065884 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-j9lc2"]
Mar 08 05:58:23 crc kubenswrapper[4717]: I0308 05:58:23.082531 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-j9lc2"]
Mar 08 05:58:23 crc kubenswrapper[4717]: I0308 05:58:23.790731 4717 generic.go:334] "Generic (PLEG): container finished" podID="16141190-3d2b-4f32-9249-75104b5afdfa" containerID="88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425" exitCode=0
Mar 08 05:58:23 crc kubenswrapper[4717]: I0308 05:58:23.806871 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac4bbfc-288a-451e-8f03-864b4b2cb96e" path="/var/lib/kubelet/pods/0ac4bbfc-288a-451e-8f03-864b4b2cb96e/volumes"
Mar 08 05:58:23 crc kubenswrapper[4717]: I0308 05:58:23.809637 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpmb6" event={"ID":"16141190-3d2b-4f32-9249-75104b5afdfa","Type":"ContainerDied","Data":"88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425"}
Mar 08 05:58:24 crc kubenswrapper[4717]: I0308 05:58:24.805647 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpmb6" event={"ID":"16141190-3d2b-4f32-9249-75104b5afdfa","Type":"ContainerStarted","Data":"db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657"}
Mar 08 05:58:24 crc kubenswrapper[4717]: I0308 05:58:24.835246 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jpmb6" podStartSLOduration=2.381502442 podStartE2EDuration="4.835225203s" podCreationTimestamp="2026-03-08 05:58:20 +0000 UTC" firstStartedPulling="2026-03-08 05:58:21.755399487 +0000 UTC m=+1928.673048341" lastFinishedPulling="2026-03-08 05:58:24.209122218 +0000 UTC m=+1931.126771102" observedRunningTime="2026-03-08 05:58:24.828487847 +0000 UTC m=+1931.746136701" watchObservedRunningTime="2026-03-08 05:58:24.835225203 +0000 UTC m=+1931.752874047"
Mar 08 05:58:30 crc kubenswrapper[4717]: I0308 05:58:30.899833 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jpmb6"
Mar 08 05:58:30 crc kubenswrapper[4717]: I0308 05:58:30.900513 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jpmb6"
Mar 08 05:58:30 crc kubenswrapper[4717]: I0308 05:58:30.970212 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jpmb6"
Mar 08 05:58:31 crc kubenswrapper[4717]: I0308 05:58:31.946755 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jpmb6"
Mar 08 05:58:32 crc kubenswrapper[4717]: I0308 05:58:32.023453 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jpmb6"]
Mar 08 05:58:33 crc kubenswrapper[4717]: I0308 05:58:33.914908 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jpmb6" podUID="16141190-3d2b-4f32-9249-75104b5afdfa" containerName="registry-server" containerID="cri-o://db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657" gracePeriod=2
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.035462 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rlvrs"]
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.050980 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rlvrs"]
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.323897 4717 scope.go:117] "RemoveContainer" containerID="8f8d69e9d6cc8b05c2041743600d3eaa85202c84e62d77a0f17afb206daa3902"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.398483 4717 scope.go:117] "RemoveContainer" containerID="638fb144780a760acef193ddb2040a3dffb74bb1fd6d741e6aa474c94a8f4ac6"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.407553 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jpmb6"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.424560 4717 scope.go:117] "RemoveContainer" containerID="207480c3692c1d3bfed26d5c8ae0a4b4e7e835919d7acf36b2e52514805ce2eb"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.510435 4717 scope.go:117] "RemoveContainer" containerID="ed51d405b2bc45be9abcfa817172cb6ff7c1fd2dcc3a0ada30ccd6707e23e39b"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.552481 4717 scope.go:117] "RemoveContainer" containerID="35d68b7105c54ff4a19e15560804238b3d22a51fa077f704faf2030cb4a39663"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.555654 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-catalog-content\") pod \"16141190-3d2b-4f32-9249-75104b5afdfa\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") "
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.555784 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-utilities\") pod \"16141190-3d2b-4f32-9249-75104b5afdfa\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") "
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.555870 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjvff\" (UniqueName: \"kubernetes.io/projected/16141190-3d2b-4f32-9249-75104b5afdfa-kube-api-access-cjvff\") pod \"16141190-3d2b-4f32-9249-75104b5afdfa\" (UID: \"16141190-3d2b-4f32-9249-75104b5afdfa\") "
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.556623 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-utilities" (OuterVolumeSpecName: "utilities") pod "16141190-3d2b-4f32-9249-75104b5afdfa" (UID: "16141190-3d2b-4f32-9249-75104b5afdfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.565629 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16141190-3d2b-4f32-9249-75104b5afdfa-kube-api-access-cjvff" (OuterVolumeSpecName: "kube-api-access-cjvff") pod "16141190-3d2b-4f32-9249-75104b5afdfa" (UID: "16141190-3d2b-4f32-9249-75104b5afdfa"). InnerVolumeSpecName "kube-api-access-cjvff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.591615 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16141190-3d2b-4f32-9249-75104b5afdfa" (UID: "16141190-3d2b-4f32-9249-75104b5afdfa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.597759 4717 scope.go:117] "RemoveContainer" containerID="3943fe4796494594cdb3420ef1d779da4c4fb9883a27fe3efae364b447849b8f"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.643006 4717 scope.go:117] "RemoveContainer" containerID="2ecc3c5b09cedad03e8d0a14eaa8eaaf53a5377ab3d674d50a3decf8ef90d3b1"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.659020 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.659068 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16141190-3d2b-4f32-9249-75104b5afdfa-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.659089 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjvff\" (UniqueName: \"kubernetes.io/projected/16141190-3d2b-4f32-9249-75104b5afdfa-kube-api-access-cjvff\") on node \"crc\" DevicePath \"\""
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.685265 4717 scope.go:117] "RemoveContainer" containerID="ee462ee56f5c19d14d4d900eff41855441d1b938f8749802a8f55fb28288db38"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.937433 4717 generic.go:334] "Generic (PLEG): container finished" podID="16141190-3d2b-4f32-9249-75104b5afdfa" containerID="db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657" exitCode=0
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.937480 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpmb6" event={"ID":"16141190-3d2b-4f32-9249-75104b5afdfa","Type":"ContainerDied","Data":"db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657"}
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.937522 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jpmb6"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.937542 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpmb6" event={"ID":"16141190-3d2b-4f32-9249-75104b5afdfa","Type":"ContainerDied","Data":"afccd328c186d1966dfaa0fcf2cb5d7d974724f48459be14f5ab6c4250f52586"}
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.937565 4717 scope.go:117] "RemoveContainer" containerID="db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657"
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.985336 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jpmb6"]
Mar 08 05:58:34 crc kubenswrapper[4717]: I0308 05:58:34.986263 4717 scope.go:117] "RemoveContainer" containerID="88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425"
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.002231 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jpmb6"]
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.014032 4717 scope.go:117] "RemoveContainer" containerID="3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b"
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.050898 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-66q5h"]
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.059187 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-66q5h"]
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.098785 4717 scope.go:117] "RemoveContainer" containerID="db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657"
Mar 08 05:58:35 crc kubenswrapper[4717]: E0308 05:58:35.099415 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657\": container with ID starting with db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657 not found: ID does not exist" containerID="db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657"
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.099487 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657"} err="failed to get container status \"db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657\": rpc error: code = NotFound desc = could not find container \"db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657\": container with ID starting with db3ec0fc8021e525a7b849a94cdb8e639a07626f18296d44133434aae1952657 not found: ID does not exist"
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.099527 4717 scope.go:117] "RemoveContainer" containerID="88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425"
Mar 08 05:58:35 crc kubenswrapper[4717]: E0308 05:58:35.100248 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425\": container with ID starting with 88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425 not found: ID does not exist" containerID="88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425"
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.100273 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425"} err="failed to get container status \"88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425\": rpc error: code = NotFound desc = could not find container \"88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425\": container with ID starting with 88b6cc83d26eb8417200a9343ce186b810eddf441541165db51abb0e06ae9425 not found: ID does not exist"
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.100339 4717 scope.go:117] "RemoveContainer" containerID="3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b"
Mar 08 05:58:35 crc kubenswrapper[4717]: E0308 05:58:35.100782 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b\": container with ID starting with 3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b not found: ID does not exist" containerID="3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b"
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.100818 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b"} err="failed to get container status \"3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b\": rpc error: code = NotFound desc = could not find container \"3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b\": container with ID starting with 3a2eea3bef24a6ad1921b62d4117a367fe8d2e0a5e0c482a4e10e5bed474a71b not found: ID does not exist"
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.798485 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16141190-3d2b-4f32-9249-75104b5afdfa" path="/var/lib/kubelet/pods/16141190-3d2b-4f32-9249-75104b5afdfa/volumes"
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.800405 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c734bf7-1916-4a47-93e0-42caaaced812" path="/var/lib/kubelet/pods/6c734bf7-1916-4a47-93e0-42caaaced812/volumes"
Mar 08 05:58:35 crc kubenswrapper[4717]: I0308 05:58:35.801773 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec6c6686-44c7-49ec-950b-7054d96e207d" path="/var/lib/kubelet/pods/ec6c6686-44c7-49ec-950b-7054d96e207d/volumes"
Mar 08 05:59:18 crc kubenswrapper[4717]: I0308 05:59:18.073934 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-lwr4v"]
Mar 08 05:59:18 crc kubenswrapper[4717]: I0308 05:59:18.086578 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-lwr4v"]
Mar 08 05:59:19 crc kubenswrapper[4717]: I0308 05:59:19.803333 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ec8299-e288-40a1-882a-0980ef3b21d4" path="/var/lib/kubelet/pods/63ec8299-e288-40a1-882a-0980ef3b21d4/volumes"
Mar 08 05:59:20 crc kubenswrapper[4717]: I0308 05:59:20.053655 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wwbxz"]
Mar 08 05:59:20 crc kubenswrapper[4717]: I0308 05:59:20.066266 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jpsxr"]
Mar 08 05:59:20 crc kubenswrapper[4717]: I0308 05:59:20.075868 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-30a0-account-create-update-7xzkx"]
Mar 08 05:59:20 crc kubenswrapper[4717]: I0308 05:59:20.085995 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d06f-account-create-update-8vf24"]
Mar 08 05:59:20 crc kubenswrapper[4717]: I0308 05:59:20.094085 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wwbxz"]
Mar 08 05:59:20 crc kubenswrapper[4717]: I0308 05:59:20.100911 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-30a0-account-create-update-7xzkx"]
Mar 08 05:59:20 crc kubenswrapper[4717]: I0308 05:59:20.107598 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ef6d-account-create-update-zrg6l"]
Mar 08 05:59:20 crc kubenswrapper[4717]: I0308 05:59:20.113953 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jpsxr"]
Mar 08 05:59:20 crc kubenswrapper[4717]: I0308 05:59:20.120076 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d06f-account-create-update-8vf24"]
Mar 08 05:59:20 crc kubenswrapper[4717]: I0308 05:59:20.126183 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ef6d-account-create-update-zrg6l"]
Mar 08 05:59:21 crc kubenswrapper[4717]: I0308 05:59:21.811594 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d25b2af-6a1d-4145-8627-5ba8338bcbef" path="/var/lib/kubelet/pods/2d25b2af-6a1d-4145-8627-5ba8338bcbef/volumes"
Mar 08 05:59:21 crc kubenswrapper[4717]: I0308 05:59:21.813436 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="413d3437-74ee-4793-9088-77fac53e4d7c" path="/var/lib/kubelet/pods/413d3437-74ee-4793-9088-77fac53e4d7c/volumes"
Mar 08 05:59:21 crc kubenswrapper[4717]: I0308 05:59:21.814797 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46ffbca0-f2a1-4c6b-8594-996db23783f2" path="/var/lib/kubelet/pods/46ffbca0-f2a1-4c6b-8594-996db23783f2/volumes"
Mar 08 05:59:21 crc kubenswrapper[4717]: I0308 05:59:21.815936 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75976964-499c-4d15-937e-4921f1b16150" path="/var/lib/kubelet/pods/75976964-499c-4d15-937e-4921f1b16150/volumes"
Mar 08 05:59:21 crc kubenswrapper[4717]: I0308 05:59:21.819070 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5702b49-8d23-42f8-a162-6783a3eb4a53" path="/var/lib/kubelet/pods/e5702b49-8d23-42f8-a162-6783a3eb4a53/volumes"
Mar 08 05:59:34 crc kubenswrapper[4717]: I0308 05:59:34.707473 4717 generic.go:334] "Generic (PLEG): container finished" podID="fbffff61-9614-4594-b52e-be489d2b2f22" containerID="e997382420839e29afde49fb758bd745c92b8769fa4d56f668d4b406a2861f97" exitCode=0
Mar 08 05:59:34 crc kubenswrapper[4717]: I0308 05:59:34.707551 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm" event={"ID":"fbffff61-9614-4594-b52e-be489d2b2f22","Type":"ContainerDied","Data":"e997382420839e29afde49fb758bd745c92b8769fa4d56f668d4b406a2861f97"}
Mar 08 05:59:34 crc kubenswrapper[4717]: I0308 05:59:34.984175 4717 scope.go:117] "RemoveContainer" containerID="f721adaa689443e873da9f195a8ac787b708bb62e59ebc9429a0b488f16dec95"
Mar 08 05:59:35 crc kubenswrapper[4717]: I0308 05:59:35.012001 4717 scope.go:117] "RemoveContainer" containerID="7e11b405b280a0e3dc60a90053e04e0d057e56cf6f6094e135d9311fe4f25982"
Mar 08 05:59:35 crc kubenswrapper[4717]: I0308 05:59:35.071332 4717 scope.go:117] "RemoveContainer" containerID="a41a9e9098882e4e7f1fe94978bdef902eefc12293c186376c6466141d8b5ab8"
Mar 08 05:59:35 crc kubenswrapper[4717]: I0308 05:59:35.115793 4717 scope.go:117] "RemoveContainer" containerID="18228d80281d2a60ffcf4c56d401c6cb7b582c8dabab1a6352f8c7e35f831ade"
Mar 08 05:59:35 crc kubenswrapper[4717]: I0308 05:59:35.184796 4717 scope.go:117] "RemoveContainer" containerID="6d863e6acd09b337b0c38622de8d4fcc696d4705ecc9d106fceb5a7e9490d0c4"
Mar 08 05:59:35 crc kubenswrapper[4717]: I0308 05:59:35.255662 4717 scope.go:117] "RemoveContainer" containerID="8a0cb418f30515d9cca8f3fc5b3c8e605b366bb1382b7d5002e81b803047f17e"
Mar 08 05:59:35 crc kubenswrapper[4717]: I0308 05:59:35.307645 4717 scope.go:117] "RemoveContainer" containerID="23e1e15c00e1ecd322c7ffaa7bb7932a2bd02737f81ba3236f83f3d8be2efd7e"
Mar 08 05:59:35 crc kubenswrapper[4717]: I0308 05:59:35.356753 4717 scope.go:117] "RemoveContainer" containerID="a1b2881a85ed3fb26244d840424001ed464395dbc12b2d3b1484795815b8486a"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.170747 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.291687 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cgw8\" (UniqueName: \"kubernetes.io/projected/fbffff61-9614-4594-b52e-be489d2b2f22-kube-api-access-4cgw8\") pod \"fbffff61-9614-4594-b52e-be489d2b2f22\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") "
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.291765 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-ssh-key-openstack-edpm-ipam\") pod \"fbffff61-9614-4594-b52e-be489d2b2f22\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") "
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.291876 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-inventory\") pod \"fbffff61-9614-4594-b52e-be489d2b2f22\" (UID: \"fbffff61-9614-4594-b52e-be489d2b2f22\") "
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.308086 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbffff61-9614-4594-b52e-be489d2b2f22-kube-api-access-4cgw8" (OuterVolumeSpecName: "kube-api-access-4cgw8") pod "fbffff61-9614-4594-b52e-be489d2b2f22" (UID: "fbffff61-9614-4594-b52e-be489d2b2f22"). InnerVolumeSpecName "kube-api-access-4cgw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.334270 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-inventory" (OuterVolumeSpecName: "inventory") pod "fbffff61-9614-4594-b52e-be489d2b2f22" (UID: "fbffff61-9614-4594-b52e-be489d2b2f22"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.345165 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fbffff61-9614-4594-b52e-be489d2b2f22" (UID: "fbffff61-9614-4594-b52e-be489d2b2f22"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.394213 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-inventory\") on node \"crc\" DevicePath \"\""
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.394430 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cgw8\" (UniqueName: \"kubernetes.io/projected/fbffff61-9614-4594-b52e-be489d2b2f22-kube-api-access-4cgw8\") on node \"crc\" DevicePath \"\""
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.394526 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbffff61-9614-4594-b52e-be489d2b2f22-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.729009 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm" event={"ID":"fbffff61-9614-4594-b52e-be489d2b2f22","Type":"ContainerDied","Data":"5fbdb0ae639b395a1e7ba4993e48f2fa36405a09887bd506abfcf99779f16fa9"}
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.729050 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fbdb0ae639b395a1e7ba4993e48f2fa36405a09887bd506abfcf99779f16fa9"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.729328 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.840522 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w"]
Mar 08 05:59:36 crc kubenswrapper[4717]: E0308 05:59:36.841131 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16141190-3d2b-4f32-9249-75104b5afdfa" containerName="extract-utilities"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.841197 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="16141190-3d2b-4f32-9249-75104b5afdfa" containerName="extract-utilities"
Mar 08 05:59:36 crc kubenswrapper[4717]: E0308 05:59:36.841272 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbffff61-9614-4594-b52e-be489d2b2f22" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.841332 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbffff61-9614-4594-b52e-be489d2b2f22" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 08 05:59:36 crc kubenswrapper[4717]: E0308 05:59:36.841401 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16141190-3d2b-4f32-9249-75104b5afdfa" containerName="registry-server"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.841456 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="16141190-3d2b-4f32-9249-75104b5afdfa" containerName="registry-server"
Mar 08 05:59:36 crc kubenswrapper[4717]: E0308 05:59:36.841513 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16141190-3d2b-4f32-9249-75104b5afdfa" containerName="extract-content"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.841561 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="16141190-3d2b-4f32-9249-75104b5afdfa" containerName="extract-content"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.841837 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbffff61-9614-4594-b52e-be489d2b2f22" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.841902 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="16141190-3d2b-4f32-9249-75104b5afdfa" containerName="registry-server"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.842591 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.845441 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.848156 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.848177 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.852375 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.861738 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w"]
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.904349 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc72w\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.904431 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmdb7\" (UniqueName: \"kubernetes.io/projected/f1249858-ba3a-4c6e-af8a-b7784e9795a0-kube-api-access-qmdb7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc72w\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w"
Mar 08 05:59:36 crc kubenswrapper[4717]: I0308 05:59:36.904559 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc72w\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w"
Mar 08 05:59:37 crc kubenswrapper[4717]: I0308 05:59:37.013863 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc72w\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w"
Mar 08 05:59:37 crc kubenswrapper[4717]: I0308 05:59:37.014063 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc72w\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w"
Mar 08 05:59:37 crc kubenswrapper[4717]: I0308 05:59:37.014160 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmdb7\" (UniqueName: \"kubernetes.io/projected/f1249858-ba3a-4c6e-af8a-b7784e9795a0-kube-api-access-qmdb7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc72w\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" Mar 08 05:59:37 crc kubenswrapper[4717]: I0308 05:59:37.022072 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc72w\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" Mar 08 05:59:37 crc kubenswrapper[4717]: I0308 05:59:37.024344 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc72w\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" Mar 08 05:59:37 crc kubenswrapper[4717]: I0308 05:59:37.049339 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmdb7\" (UniqueName: \"kubernetes.io/projected/f1249858-ba3a-4c6e-af8a-b7784e9795a0-kube-api-access-qmdb7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kc72w\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" Mar 08 05:59:37 crc kubenswrapper[4717]: I0308 05:59:37.162876 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" Mar 08 05:59:37 crc kubenswrapper[4717]: I0308 05:59:37.529582 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w"] Mar 08 05:59:37 crc kubenswrapper[4717]: I0308 05:59:37.742675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" event={"ID":"f1249858-ba3a-4c6e-af8a-b7784e9795a0","Type":"ContainerStarted","Data":"9b15b7d3a8213bffa7e57d6e0c513e760b19d96cb0c2696d5546ff68a0e83409"} Mar 08 05:59:38 crc kubenswrapper[4717]: I0308 05:59:38.755495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" event={"ID":"f1249858-ba3a-4c6e-af8a-b7784e9795a0","Type":"ContainerStarted","Data":"58f73b997deeb70ca95ac99147933e49277ced80046dacc20a3e51c0e7be2d77"} Mar 08 05:59:38 crc kubenswrapper[4717]: I0308 05:59:38.782265 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" podStartSLOduration=2.299019461 podStartE2EDuration="2.782244098s" podCreationTimestamp="2026-03-08 05:59:36 +0000 UTC" firstStartedPulling="2026-03-08 05:59:37.536269349 +0000 UTC m=+2004.453918203" lastFinishedPulling="2026-03-08 05:59:38.019493996 +0000 UTC m=+2004.937142840" observedRunningTime="2026-03-08 05:59:38.771939656 +0000 UTC m=+2005.689588530" watchObservedRunningTime="2026-03-08 05:59:38.782244098 +0000 UTC m=+2005.699892962" Mar 08 05:59:43 crc kubenswrapper[4717]: I0308 05:59:43.832379 4717 generic.go:334] "Generic (PLEG): container finished" podID="f1249858-ba3a-4c6e-af8a-b7784e9795a0" containerID="58f73b997deeb70ca95ac99147933e49277ced80046dacc20a3e51c0e7be2d77" exitCode=0 Mar 08 05:59:43 crc kubenswrapper[4717]: I0308 05:59:43.832493 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" event={"ID":"f1249858-ba3a-4c6e-af8a-b7784e9795a0","Type":"ContainerDied","Data":"58f73b997deeb70ca95ac99147933e49277ced80046dacc20a3e51c0e7be2d77"} Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.262712 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.408696 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmdb7\" (UniqueName: \"kubernetes.io/projected/f1249858-ba3a-4c6e-af8a-b7784e9795a0-kube-api-access-qmdb7\") pod \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.408807 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-inventory\") pod \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.408858 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-ssh-key-openstack-edpm-ipam\") pod \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\" (UID: \"f1249858-ba3a-4c6e-af8a-b7784e9795a0\") " Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.414504 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1249858-ba3a-4c6e-af8a-b7784e9795a0-kube-api-access-qmdb7" (OuterVolumeSpecName: "kube-api-access-qmdb7") pod "f1249858-ba3a-4c6e-af8a-b7784e9795a0" (UID: "f1249858-ba3a-4c6e-af8a-b7784e9795a0"). InnerVolumeSpecName "kube-api-access-qmdb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.444177 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-inventory" (OuterVolumeSpecName: "inventory") pod "f1249858-ba3a-4c6e-af8a-b7784e9795a0" (UID: "f1249858-ba3a-4c6e-af8a-b7784e9795a0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.457041 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f1249858-ba3a-4c6e-af8a-b7784e9795a0" (UID: "f1249858-ba3a-4c6e-af8a-b7784e9795a0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.510833 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmdb7\" (UniqueName: \"kubernetes.io/projected/f1249858-ba3a-4c6e-af8a-b7784e9795a0-kube-api-access-qmdb7\") on node \"crc\" DevicePath \"\"" Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.510869 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.510879 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f1249858-ba3a-4c6e-af8a-b7784e9795a0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.857771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" 
event={"ID":"f1249858-ba3a-4c6e-af8a-b7784e9795a0","Type":"ContainerDied","Data":"9b15b7d3a8213bffa7e57d6e0c513e760b19d96cb0c2696d5546ff68a0e83409"} Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.857816 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b15b7d3a8213bffa7e57d6e0c513e760b19d96cb0c2696d5546ff68a0e83409" Mar 08 05:59:45 crc kubenswrapper[4717]: I0308 05:59:45.857872 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kc72w" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.043207 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd"] Mar 08 05:59:46 crc kubenswrapper[4717]: E0308 05:59:46.043943 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1249858-ba3a-4c6e-af8a-b7784e9795a0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.044000 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1249858-ba3a-4c6e-af8a-b7784e9795a0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.044373 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1249858-ba3a-4c6e-af8a-b7784e9795a0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.045598 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.048008 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.054704 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd"] Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.085197 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.085351 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.085475 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.126229 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glvbx\" (UniqueName: \"kubernetes.io/projected/ef634a99-41c2-496a-b06e-d697710e676d-kube-api-access-glvbx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5vnmd\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.126307 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5vnmd\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.126507 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5vnmd\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.230449 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5vnmd\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.231201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glvbx\" (UniqueName: \"kubernetes.io/projected/ef634a99-41c2-496a-b06e-d697710e676d-kube-api-access-glvbx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5vnmd\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.231271 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5vnmd\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.247001 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-5vnmd\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.247354 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5vnmd\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.253420 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glvbx\" (UniqueName: \"kubernetes.io/projected/ef634a99-41c2-496a-b06e-d697710e676d-kube-api-access-glvbx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5vnmd\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.401543 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 05:59:46 crc kubenswrapper[4717]: I0308 05:59:46.974745 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd"] Mar 08 05:59:47 crc kubenswrapper[4717]: I0308 05:59:47.878452 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" event={"ID":"ef634a99-41c2-496a-b06e-d697710e676d","Type":"ContainerStarted","Data":"62b4b28935a838d2d78818bde9b79894eeaf72149a79d3450eee33b95e5236c5"} Mar 08 05:59:47 crc kubenswrapper[4717]: I0308 05:59:47.880195 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" event={"ID":"ef634a99-41c2-496a-b06e-d697710e676d","Type":"ContainerStarted","Data":"240437470f154e889f24de15e4501acd0dd9b8896d1ae23d78560b89ff40fdb8"} Mar 08 05:59:47 crc kubenswrapper[4717]: I0308 05:59:47.896887 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" podStartSLOduration=1.482184727 podStartE2EDuration="1.896873409s" podCreationTimestamp="2026-03-08 05:59:46 +0000 UTC" firstStartedPulling="2026-03-08 05:59:46.99161381 +0000 UTC m=+2013.909262654" lastFinishedPulling="2026-03-08 05:59:47.406302482 +0000 UTC m=+2014.323951336" observedRunningTime="2026-03-08 05:59:47.894593663 +0000 UTC m=+2014.812242537" watchObservedRunningTime="2026-03-08 05:59:47.896873409 +0000 UTC m=+2014.814522253" Mar 08 05:59:56 crc kubenswrapper[4717]: I0308 05:59:56.048796 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x6mmq"] Mar 08 05:59:56 crc kubenswrapper[4717]: I0308 05:59:56.062813 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x6mmq"] Mar 08 05:59:57 crc kubenswrapper[4717]: I0308 
05:59:57.794916 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4" path="/var/lib/kubelet/pods/1dca6f6a-0ab8-4bb9-85df-9c1a3b98d1b4/volumes" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.144091 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549160-627gs"] Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.146970 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549160-627gs" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.150331 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.150386 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.150533 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.152711 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq"] Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.153953 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.156669 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.156910 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.167285 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq"] Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.179435 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549160-627gs"] Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.251647 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4n4z\" (UniqueName: \"kubernetes.io/projected/82fed304-b5c5-403b-9152-5c683be53931-kube-api-access-j4n4z\") pod \"collect-profiles-29549160-v2bfq\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.251890 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82fed304-b5c5-403b-9152-5c683be53931-secret-volume\") pod \"collect-profiles-29549160-v2bfq\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.251963 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/82fed304-b5c5-403b-9152-5c683be53931-config-volume\") pod \"collect-profiles-29549160-v2bfq\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.252062 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g677n\" (UniqueName: \"kubernetes.io/projected/28d83359-ba6f-45da-9eee-f3348558e8cb-kube-api-access-g677n\") pod \"auto-csr-approver-29549160-627gs\" (UID: \"28d83359-ba6f-45da-9eee-f3348558e8cb\") " pod="openshift-infra/auto-csr-approver-29549160-627gs" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.354321 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82fed304-b5c5-403b-9152-5c683be53931-config-volume\") pod \"collect-profiles-29549160-v2bfq\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.354400 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g677n\" (UniqueName: \"kubernetes.io/projected/28d83359-ba6f-45da-9eee-f3348558e8cb-kube-api-access-g677n\") pod \"auto-csr-approver-29549160-627gs\" (UID: \"28d83359-ba6f-45da-9eee-f3348558e8cb\") " pod="openshift-infra/auto-csr-approver-29549160-627gs" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.354493 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4n4z\" (UniqueName: \"kubernetes.io/projected/82fed304-b5c5-403b-9152-5c683be53931-kube-api-access-j4n4z\") pod \"collect-profiles-29549160-v2bfq\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:00 crc kubenswrapper[4717]: 
I0308 06:00:00.354565 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82fed304-b5c5-403b-9152-5c683be53931-secret-volume\") pod \"collect-profiles-29549160-v2bfq\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.355604 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82fed304-b5c5-403b-9152-5c683be53931-config-volume\") pod \"collect-profiles-29549160-v2bfq\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.370938 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82fed304-b5c5-403b-9152-5c683be53931-secret-volume\") pod \"collect-profiles-29549160-v2bfq\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.376384 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4n4z\" (UniqueName: \"kubernetes.io/projected/82fed304-b5c5-403b-9152-5c683be53931-kube-api-access-j4n4z\") pod \"collect-profiles-29549160-v2bfq\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.377806 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g677n\" (UniqueName: \"kubernetes.io/projected/28d83359-ba6f-45da-9eee-f3348558e8cb-kube-api-access-g677n\") pod \"auto-csr-approver-29549160-627gs\" (UID: \"28d83359-ba6f-45da-9eee-f3348558e8cb\") " 
pod="openshift-infra/auto-csr-approver-29549160-627gs" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.488389 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549160-627gs" Mar 08 06:00:00 crc kubenswrapper[4717]: I0308 06:00:00.505154 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:01 crc kubenswrapper[4717]: I0308 06:00:01.044198 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549160-627gs"] Mar 08 06:00:01 crc kubenswrapper[4717]: W0308 06:00:01.046784 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28d83359_ba6f_45da_9eee_f3348558e8cb.slice/crio-bb274fbe58782a3a2784fb67b70ee71fffc9ceb721ed457721dd5658728039d9 WatchSource:0}: Error finding container bb274fbe58782a3a2784fb67b70ee71fffc9ceb721ed457721dd5658728039d9: Status 404 returned error can't find the container with id bb274fbe58782a3a2784fb67b70ee71fffc9ceb721ed457721dd5658728039d9 Mar 08 06:00:01 crc kubenswrapper[4717]: I0308 06:00:01.125069 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq"] Mar 08 06:00:01 crc kubenswrapper[4717]: W0308 06:00:01.127294 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82fed304_b5c5_403b_9152_5c683be53931.slice/crio-3b3263adee3a52b4fa4af7ede1e5674ba74801e06bbb3638a90c6361ef4b50fc WatchSource:0}: Error finding container 3b3263adee3a52b4fa4af7ede1e5674ba74801e06bbb3638a90c6361ef4b50fc: Status 404 returned error can't find the container with id 3b3263adee3a52b4fa4af7ede1e5674ba74801e06bbb3638a90c6361ef4b50fc Mar 08 06:00:02 crc kubenswrapper[4717]: I0308 06:00:02.030616 4717 generic.go:334] 
"Generic (PLEG): container finished" podID="82fed304-b5c5-403b-9152-5c683be53931" containerID="1d4a5ddcbe68fd34fff714ecbd16e34fab97da7dc904b651d56c310e294e8199" exitCode=0 Mar 08 06:00:02 crc kubenswrapper[4717]: I0308 06:00:02.030661 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" event={"ID":"82fed304-b5c5-403b-9152-5c683be53931","Type":"ContainerDied","Data":"1d4a5ddcbe68fd34fff714ecbd16e34fab97da7dc904b651d56c310e294e8199"} Mar 08 06:00:02 crc kubenswrapper[4717]: I0308 06:00:02.030947 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" event={"ID":"82fed304-b5c5-403b-9152-5c683be53931","Type":"ContainerStarted","Data":"3b3263adee3a52b4fa4af7ede1e5674ba74801e06bbb3638a90c6361ef4b50fc"} Mar 08 06:00:02 crc kubenswrapper[4717]: I0308 06:00:02.032845 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549160-627gs" event={"ID":"28d83359-ba6f-45da-9eee-f3348558e8cb","Type":"ContainerStarted","Data":"bb274fbe58782a3a2784fb67b70ee71fffc9ceb721ed457721dd5658728039d9"} Mar 08 06:00:03 crc kubenswrapper[4717]: I0308 06:00:03.473008 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:03 crc kubenswrapper[4717]: I0308 06:00:03.520496 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82fed304-b5c5-403b-9152-5c683be53931-secret-volume\") pod \"82fed304-b5c5-403b-9152-5c683be53931\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " Mar 08 06:00:03 crc kubenswrapper[4717]: I0308 06:00:03.520563 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82fed304-b5c5-403b-9152-5c683be53931-config-volume\") pod \"82fed304-b5c5-403b-9152-5c683be53931\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " Mar 08 06:00:03 crc kubenswrapper[4717]: I0308 06:00:03.520821 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4n4z\" (UniqueName: \"kubernetes.io/projected/82fed304-b5c5-403b-9152-5c683be53931-kube-api-access-j4n4z\") pod \"82fed304-b5c5-403b-9152-5c683be53931\" (UID: \"82fed304-b5c5-403b-9152-5c683be53931\") " Mar 08 06:00:03 crc kubenswrapper[4717]: I0308 06:00:03.521585 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82fed304-b5c5-403b-9152-5c683be53931-config-volume" (OuterVolumeSpecName: "config-volume") pod "82fed304-b5c5-403b-9152-5c683be53931" (UID: "82fed304-b5c5-403b-9152-5c683be53931"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:00:03 crc kubenswrapper[4717]: I0308 06:00:03.527781 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82fed304-b5c5-403b-9152-5c683be53931-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "82fed304-b5c5-403b-9152-5c683be53931" (UID: "82fed304-b5c5-403b-9152-5c683be53931"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:00:03 crc kubenswrapper[4717]: I0308 06:00:03.530423 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82fed304-b5c5-403b-9152-5c683be53931-kube-api-access-j4n4z" (OuterVolumeSpecName: "kube-api-access-j4n4z") pod "82fed304-b5c5-403b-9152-5c683be53931" (UID: "82fed304-b5c5-403b-9152-5c683be53931"). InnerVolumeSpecName "kube-api-access-j4n4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:00:03 crc kubenswrapper[4717]: I0308 06:00:03.623609 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82fed304-b5c5-403b-9152-5c683be53931-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 06:00:03 crc kubenswrapper[4717]: I0308 06:00:03.623647 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82fed304-b5c5-403b-9152-5c683be53931-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 06:00:03 crc kubenswrapper[4717]: I0308 06:00:03.623658 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4n4z\" (UniqueName: \"kubernetes.io/projected/82fed304-b5c5-403b-9152-5c683be53931-kube-api-access-j4n4z\") on node \"crc\" DevicePath \"\"" Mar 08 06:00:04 crc kubenswrapper[4717]: I0308 06:00:04.051947 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" event={"ID":"82fed304-b5c5-403b-9152-5c683be53931","Type":"ContainerDied","Data":"3b3263adee3a52b4fa4af7ede1e5674ba74801e06bbb3638a90c6361ef4b50fc"} Mar 08 06:00:04 crc kubenswrapper[4717]: I0308 06:00:04.051984 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b3263adee3a52b4fa4af7ede1e5674ba74801e06bbb3638a90c6361ef4b50fc" Mar 08 06:00:04 crc kubenswrapper[4717]: I0308 06:00:04.052048 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq" Mar 08 06:00:05 crc kubenswrapper[4717]: I0308 06:00:05.071030 4717 generic.go:334] "Generic (PLEG): container finished" podID="28d83359-ba6f-45da-9eee-f3348558e8cb" containerID="30a76715bd60145e1352b51bd910145b766498ba839576ef8681ec20147c1d6f" exitCode=0 Mar 08 06:00:05 crc kubenswrapper[4717]: I0308 06:00:05.071151 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549160-627gs" event={"ID":"28d83359-ba6f-45da-9eee-f3348558e8cb","Type":"ContainerDied","Data":"30a76715bd60145e1352b51bd910145b766498ba839576ef8681ec20147c1d6f"} Mar 08 06:00:06 crc kubenswrapper[4717]: I0308 06:00:06.559295 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549160-627gs" Mar 08 06:00:06 crc kubenswrapper[4717]: I0308 06:00:06.591139 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g677n\" (UniqueName: \"kubernetes.io/projected/28d83359-ba6f-45da-9eee-f3348558e8cb-kube-api-access-g677n\") pod \"28d83359-ba6f-45da-9eee-f3348558e8cb\" (UID: \"28d83359-ba6f-45da-9eee-f3348558e8cb\") " Mar 08 06:00:06 crc kubenswrapper[4717]: I0308 06:00:06.597338 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d83359-ba6f-45da-9eee-f3348558e8cb-kube-api-access-g677n" (OuterVolumeSpecName: "kube-api-access-g677n") pod "28d83359-ba6f-45da-9eee-f3348558e8cb" (UID: "28d83359-ba6f-45da-9eee-f3348558e8cb"). InnerVolumeSpecName "kube-api-access-g677n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:00:06 crc kubenswrapper[4717]: I0308 06:00:06.693171 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g677n\" (UniqueName: \"kubernetes.io/projected/28d83359-ba6f-45da-9eee-f3348558e8cb-kube-api-access-g677n\") on node \"crc\" DevicePath \"\"" Mar 08 06:00:07 crc kubenswrapper[4717]: I0308 06:00:07.096504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549160-627gs" event={"ID":"28d83359-ba6f-45da-9eee-f3348558e8cb","Type":"ContainerDied","Data":"bb274fbe58782a3a2784fb67b70ee71fffc9ceb721ed457721dd5658728039d9"} Mar 08 06:00:07 crc kubenswrapper[4717]: I0308 06:00:07.096579 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb274fbe58782a3a2784fb67b70ee71fffc9ceb721ed457721dd5658728039d9" Mar 08 06:00:07 crc kubenswrapper[4717]: I0308 06:00:07.096591 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549160-627gs" Mar 08 06:00:07 crc kubenswrapper[4717]: I0308 06:00:07.648931 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549154-5zhm4"] Mar 08 06:00:07 crc kubenswrapper[4717]: I0308 06:00:07.658786 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549154-5zhm4"] Mar 08 06:00:07 crc kubenswrapper[4717]: I0308 06:00:07.796896 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="447edb16-b654-4e0c-8a6a-88b08b829b9a" path="/var/lib/kubelet/pods/447edb16-b654-4e0c-8a6a-88b08b829b9a/volumes" Mar 08 06:00:18 crc kubenswrapper[4717]: I0308 06:00:18.048826 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-dgm67"] Mar 08 06:00:18 crc kubenswrapper[4717]: I0308 06:00:18.066757 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-cell-mapping-dgm67"] Mar 08 06:00:19 crc kubenswrapper[4717]: I0308 06:00:19.793456 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ac0da5-7bbd-420a-b56d-60c621244d30" path="/var/lib/kubelet/pods/29ac0da5-7bbd-420a-b56d-60c621244d30/volumes" Mar 08 06:00:22 crc kubenswrapper[4717]: I0308 06:00:22.063844 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s2nrg"] Mar 08 06:00:22 crc kubenswrapper[4717]: I0308 06:00:22.078927 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s2nrg"] Mar 08 06:00:23 crc kubenswrapper[4717]: I0308 06:00:23.800617 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84a798f5-2296-45b1-ad1e-5d31f85c67d3" path="/var/lib/kubelet/pods/84a798f5-2296-45b1-ad1e-5d31f85c67d3/volumes" Mar 08 06:00:26 crc kubenswrapper[4717]: I0308 06:00:26.349131 4717 generic.go:334] "Generic (PLEG): container finished" podID="ef634a99-41c2-496a-b06e-d697710e676d" containerID="62b4b28935a838d2d78818bde9b79894eeaf72149a79d3450eee33b95e5236c5" exitCode=0 Mar 08 06:00:26 crc kubenswrapper[4717]: I0308 06:00:26.349249 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" event={"ID":"ef634a99-41c2-496a-b06e-d697710e676d","Type":"ContainerDied","Data":"62b4b28935a838d2d78818bde9b79894eeaf72149a79d3450eee33b95e5236c5"} Mar 08 06:00:27 crc kubenswrapper[4717]: I0308 06:00:27.887352 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.027009 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-ssh-key-openstack-edpm-ipam\") pod \"ef634a99-41c2-496a-b06e-d697710e676d\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.027228 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glvbx\" (UniqueName: \"kubernetes.io/projected/ef634a99-41c2-496a-b06e-d697710e676d-kube-api-access-glvbx\") pod \"ef634a99-41c2-496a-b06e-d697710e676d\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.027485 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-inventory\") pod \"ef634a99-41c2-496a-b06e-d697710e676d\" (UID: \"ef634a99-41c2-496a-b06e-d697710e676d\") " Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.044936 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef634a99-41c2-496a-b06e-d697710e676d-kube-api-access-glvbx" (OuterVolumeSpecName: "kube-api-access-glvbx") pod "ef634a99-41c2-496a-b06e-d697710e676d" (UID: "ef634a99-41c2-496a-b06e-d697710e676d"). InnerVolumeSpecName "kube-api-access-glvbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.058128 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-inventory" (OuterVolumeSpecName: "inventory") pod "ef634a99-41c2-496a-b06e-d697710e676d" (UID: "ef634a99-41c2-496a-b06e-d697710e676d"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.060793 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ef634a99-41c2-496a-b06e-d697710e676d" (UID: "ef634a99-41c2-496a-b06e-d697710e676d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.129864 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.129899 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef634a99-41c2-496a-b06e-d697710e676d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.129914 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glvbx\" (UniqueName: \"kubernetes.io/projected/ef634a99-41c2-496a-b06e-d697710e676d-kube-api-access-glvbx\") on node \"crc\" DevicePath \"\"" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.379364 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" event={"ID":"ef634a99-41c2-496a-b06e-d697710e676d","Type":"ContainerDied","Data":"240437470f154e889f24de15e4501acd0dd9b8896d1ae23d78560b89ff40fdb8"} Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.379423 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="240437470f154e889f24de15e4501acd0dd9b8896d1ae23d78560b89ff40fdb8" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 
06:00:28.379508 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5vnmd" Mar 08 06:00:28 crc kubenswrapper[4717]: E0308 06:00:28.496590 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef634a99_41c2_496a_b06e_d697710e676d.slice\": RecentStats: unable to find data in memory cache]" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.505521 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9"] Mar 08 06:00:28 crc kubenswrapper[4717]: E0308 06:00:28.517577 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d83359-ba6f-45da-9eee-f3348558e8cb" containerName="oc" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.517618 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d83359-ba6f-45da-9eee-f3348558e8cb" containerName="oc" Mar 08 06:00:28 crc kubenswrapper[4717]: E0308 06:00:28.517716 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef634a99-41c2-496a-b06e-d697710e676d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.517732 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef634a99-41c2-496a-b06e-d697710e676d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:00:28 crc kubenswrapper[4717]: E0308 06:00:28.517765 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82fed304-b5c5-403b-9152-5c683be53931" containerName="collect-profiles" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.517776 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="82fed304-b5c5-403b-9152-5c683be53931" containerName="collect-profiles" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.518465 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="82fed304-b5c5-403b-9152-5c683be53931" containerName="collect-profiles" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.518514 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef634a99-41c2-496a-b06e-d697710e676d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.518536 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d83359-ba6f-45da-9eee-f3348558e8cb" containerName="oc" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.519858 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.527131 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.529181 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.529235 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.529520 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.532110 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9"] Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.647292 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptcxc\" (UniqueName: \"kubernetes.io/projected/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-kube-api-access-ptcxc\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.647861 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.648089 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.750262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.750465 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptcxc\" (UniqueName: \"kubernetes.io/projected/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-kube-api-access-ptcxc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.750504 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.757787 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.758469 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.772000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptcxc\" (UniqueName: \"kubernetes.io/projected/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-kube-api-access-ptcxc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:28 crc kubenswrapper[4717]: I0308 06:00:28.848641 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:00:29 crc kubenswrapper[4717]: W0308 06:00:29.449925 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9973d9b_7167_4a1b_9115_a7fb9a2921c3.slice/crio-e286130dbeb706b4fc9c5d7c965c6bb7ef8bfc3c80ade287e5fc15537b4f19c3 WatchSource:0}: Error finding container e286130dbeb706b4fc9c5d7c965c6bb7ef8bfc3c80ade287e5fc15537b4f19c3: Status 404 returned error can't find the container with id e286130dbeb706b4fc9c5d7c965c6bb7ef8bfc3c80ade287e5fc15537b4f19c3 Mar 08 06:00:29 crc kubenswrapper[4717]: I0308 06:00:29.460322 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9"] Mar 08 06:00:30 crc kubenswrapper[4717]: I0308 06:00:30.410135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" event={"ID":"d9973d9b-7167-4a1b-9115-a7fb9a2921c3","Type":"ContainerStarted","Data":"b68f4100510afa4f286792d69992a8ec7c10e220f070dd94cf74fe6e0dbc069f"} Mar 08 06:00:30 crc kubenswrapper[4717]: I0308 06:00:30.410503 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" event={"ID":"d9973d9b-7167-4a1b-9115-a7fb9a2921c3","Type":"ContainerStarted","Data":"e286130dbeb706b4fc9c5d7c965c6bb7ef8bfc3c80ade287e5fc15537b4f19c3"} Mar 08 06:00:30 crc kubenswrapper[4717]: I0308 06:00:30.445836 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" podStartSLOduration=1.975135076 podStartE2EDuration="2.445809236s" podCreationTimestamp="2026-03-08 06:00:28 +0000 UTC" firstStartedPulling="2026-03-08 06:00:29.452587576 +0000 UTC m=+2056.370236460" lastFinishedPulling="2026-03-08 06:00:29.923261766 +0000 UTC m=+2056.840910620" 
observedRunningTime="2026-03-08 06:00:30.437579394 +0000 UTC m=+2057.355228268" watchObservedRunningTime="2026-03-08 06:00:30.445809236 +0000 UTC m=+2057.363458120" Mar 08 06:00:34 crc kubenswrapper[4717]: I0308 06:00:34.119827 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:00:34 crc kubenswrapper[4717]: I0308 06:00:34.121995 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:00:35 crc kubenswrapper[4717]: I0308 06:00:35.506247 4717 scope.go:117] "RemoveContainer" containerID="0a66c7f0fe537f6e65ae7177f15bb0f6afa0ed616a621d68f28485607d40f336" Mar 08 06:00:35 crc kubenswrapper[4717]: I0308 06:00:35.566574 4717 scope.go:117] "RemoveContainer" containerID="97368cb953e6d8fae4e88a61a4a2fd4420986ca65ff922b3047eab800868691f" Mar 08 06:00:35 crc kubenswrapper[4717]: I0308 06:00:35.637319 4717 scope.go:117] "RemoveContainer" containerID="00de9b162608c5f1411ab6e99a593f66f0e3e33b85a7f00a215e414c16aa2735" Mar 08 06:00:35 crc kubenswrapper[4717]: I0308 06:00:35.686704 4717 scope.go:117] "RemoveContainer" containerID="1d9cd15406bfd90dcaf8552ffc44b8b4eeb86577a42d5015e284cee5025389c1" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.192657 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29549161-72gqq"] Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.197019 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.219383 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29549161-72gqq"] Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.307883 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-fernet-keys\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.307953 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-config-data\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.308200 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfjwb\" (UniqueName: \"kubernetes.io/projected/8a320a32-4b25-423c-9e3c-5ca2d08652c5-kube-api-access-wfjwb\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.308328 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-combined-ca-bundle\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.410929 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-fernet-keys\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.411007 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-config-data\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.411154 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfjwb\" (UniqueName: \"kubernetes.io/projected/8a320a32-4b25-423c-9e3c-5ca2d08652c5-kube-api-access-wfjwb\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.411227 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-combined-ca-bundle\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.418514 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-config-data\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.418762 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-combined-ca-bundle\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.419735 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-fernet-keys\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.454616 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfjwb\" (UniqueName: \"kubernetes.io/projected/8a320a32-4b25-423c-9e3c-5ca2d08652c5-kube-api-access-wfjwb\") pod \"keystone-cron-29549161-72gqq\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.532491 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:00 crc kubenswrapper[4717]: I0308 06:01:00.848971 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29549161-72gqq"] Mar 08 06:01:01 crc kubenswrapper[4717]: I0308 06:01:01.779809 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549161-72gqq" event={"ID":"8a320a32-4b25-423c-9e3c-5ca2d08652c5","Type":"ContainerStarted","Data":"8b0b3208b62965656c8652839166d5f943d0e88103ffdf01f1fc840922d2afc4"} Mar 08 06:01:01 crc kubenswrapper[4717]: I0308 06:01:01.803910 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549161-72gqq" event={"ID":"8a320a32-4b25-423c-9e3c-5ca2d08652c5","Type":"ContainerStarted","Data":"311c30ac920157dd8fc6a47c03e774d69edaba7f32bbf2cbb7b4a8aeac721f9e"} Mar 08 06:01:01 crc kubenswrapper[4717]: I0308 06:01:01.822620 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29549161-72gqq" podStartSLOduration=1.822588551 podStartE2EDuration="1.822588551s" podCreationTimestamp="2026-03-08 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 06:01:01.804882726 +0000 UTC m=+2088.722531600" watchObservedRunningTime="2026-03-08 06:01:01.822588551 +0000 UTC m=+2088.740237435" Mar 08 06:01:02 crc kubenswrapper[4717]: I0308 06:01:02.053800 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hfkcg"] Mar 08 06:01:02 crc kubenswrapper[4717]: I0308 06:01:02.075470 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hfkcg"] Mar 08 06:01:03 crc kubenswrapper[4717]: I0308 06:01:03.817042 4717 generic.go:334] "Generic (PLEG): container finished" podID="8a320a32-4b25-423c-9e3c-5ca2d08652c5" 
containerID="8b0b3208b62965656c8652839166d5f943d0e88103ffdf01f1fc840922d2afc4" exitCode=0 Mar 08 06:01:03 crc kubenswrapper[4717]: I0308 06:01:03.829318 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d5a9dec-59bb-4422-b37d-f0fe159f82d4" path="/var/lib/kubelet/pods/8d5a9dec-59bb-4422-b37d-f0fe159f82d4/volumes" Mar 08 06:01:03 crc kubenswrapper[4717]: I0308 06:01:03.829915 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549161-72gqq" event={"ID":"8a320a32-4b25-423c-9e3c-5ca2d08652c5","Type":"ContainerDied","Data":"8b0b3208b62965656c8652839166d5f943d0e88103ffdf01f1fc840922d2afc4"} Mar 08 06:01:04 crc kubenswrapper[4717]: I0308 06:01:04.120126 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:01:04 crc kubenswrapper[4717]: I0308 06:01:04.120213 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.304576 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.428911 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-config-data\") pod \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.428994 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-fernet-keys\") pod \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.429183 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-combined-ca-bundle\") pod \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.429235 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfjwb\" (UniqueName: \"kubernetes.io/projected/8a320a32-4b25-423c-9e3c-5ca2d08652c5-kube-api-access-wfjwb\") pod \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\" (UID: \"8a320a32-4b25-423c-9e3c-5ca2d08652c5\") " Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.435395 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a320a32-4b25-423c-9e3c-5ca2d08652c5-kube-api-access-wfjwb" (OuterVolumeSpecName: "kube-api-access-wfjwb") pod "8a320a32-4b25-423c-9e3c-5ca2d08652c5" (UID: "8a320a32-4b25-423c-9e3c-5ca2d08652c5"). InnerVolumeSpecName "kube-api-access-wfjwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.436869 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8a320a32-4b25-423c-9e3c-5ca2d08652c5" (UID: "8a320a32-4b25-423c-9e3c-5ca2d08652c5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.458776 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a320a32-4b25-423c-9e3c-5ca2d08652c5" (UID: "8a320a32-4b25-423c-9e3c-5ca2d08652c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.507448 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-config-data" (OuterVolumeSpecName: "config-data") pod "8a320a32-4b25-423c-9e3c-5ca2d08652c5" (UID: "8a320a32-4b25-423c-9e3c-5ca2d08652c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.532269 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.532315 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.532334 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a320a32-4b25-423c-9e3c-5ca2d08652c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.532351 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfjwb\" (UniqueName: \"kubernetes.io/projected/8a320a32-4b25-423c-9e3c-5ca2d08652c5-kube-api-access-wfjwb\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.843215 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549161-72gqq" event={"ID":"8a320a32-4b25-423c-9e3c-5ca2d08652c5","Type":"ContainerDied","Data":"311c30ac920157dd8fc6a47c03e774d69edaba7f32bbf2cbb7b4a8aeac721f9e"} Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.843570 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="311c30ac920157dd8fc6a47c03e774d69edaba7f32bbf2cbb7b4a8aeac721f9e" Mar 08 06:01:05 crc kubenswrapper[4717]: I0308 06:01:05.843306 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29549161-72gqq" Mar 08 06:01:25 crc kubenswrapper[4717]: I0308 06:01:25.081923 4717 generic.go:334] "Generic (PLEG): container finished" podID="d9973d9b-7167-4a1b-9115-a7fb9a2921c3" containerID="b68f4100510afa4f286792d69992a8ec7c10e220f070dd94cf74fe6e0dbc069f" exitCode=0 Mar 08 06:01:25 crc kubenswrapper[4717]: I0308 06:01:25.082085 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" event={"ID":"d9973d9b-7167-4a1b-9115-a7fb9a2921c3","Type":"ContainerDied","Data":"b68f4100510afa4f286792d69992a8ec7c10e220f070dd94cf74fe6e0dbc069f"} Mar 08 06:01:26 crc kubenswrapper[4717]: I0308 06:01:26.714925 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:01:26 crc kubenswrapper[4717]: I0308 06:01:26.870530 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptcxc\" (UniqueName: \"kubernetes.io/projected/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-kube-api-access-ptcxc\") pod \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " Mar 08 06:01:26 crc kubenswrapper[4717]: I0308 06:01:26.870668 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-inventory\") pod \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " Mar 08 06:01:26 crc kubenswrapper[4717]: I0308 06:01:26.870824 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-ssh-key-openstack-edpm-ipam\") pod \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\" (UID: \"d9973d9b-7167-4a1b-9115-a7fb9a2921c3\") " Mar 08 06:01:26 crc 
kubenswrapper[4717]: I0308 06:01:26.877815 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-kube-api-access-ptcxc" (OuterVolumeSpecName: "kube-api-access-ptcxc") pod "d9973d9b-7167-4a1b-9115-a7fb9a2921c3" (UID: "d9973d9b-7167-4a1b-9115-a7fb9a2921c3"). InnerVolumeSpecName "kube-api-access-ptcxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:01:26 crc kubenswrapper[4717]: I0308 06:01:26.902804 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-inventory" (OuterVolumeSpecName: "inventory") pod "d9973d9b-7167-4a1b-9115-a7fb9a2921c3" (UID: "d9973d9b-7167-4a1b-9115-a7fb9a2921c3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:01:26 crc kubenswrapper[4717]: I0308 06:01:26.905894 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d9973d9b-7167-4a1b-9115-a7fb9a2921c3" (UID: "d9973d9b-7167-4a1b-9115-a7fb9a2921c3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:01:26 crc kubenswrapper[4717]: I0308 06:01:26.975410 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:26 crc kubenswrapper[4717]: I0308 06:01:26.975459 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:26 crc kubenswrapper[4717]: I0308 06:01:26.975482 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptcxc\" (UniqueName: \"kubernetes.io/projected/d9973d9b-7167-4a1b-9115-a7fb9a2921c3-kube-api-access-ptcxc\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.101782 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" event={"ID":"d9973d9b-7167-4a1b-9115-a7fb9a2921c3","Type":"ContainerDied","Data":"e286130dbeb706b4fc9c5d7c965c6bb7ef8bfc3c80ade287e5fc15537b4f19c3"} Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.101820 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e286130dbeb706b4fc9c5d7c965c6bb7ef8bfc3c80ade287e5fc15537b4f19c3" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.101855 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.194997 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz2kg"] Mar 08 06:01:27 crc kubenswrapper[4717]: E0308 06:01:27.195436 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a320a32-4b25-423c-9e3c-5ca2d08652c5" containerName="keystone-cron" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.195454 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a320a32-4b25-423c-9e3c-5ca2d08652c5" containerName="keystone-cron" Mar 08 06:01:27 crc kubenswrapper[4717]: E0308 06:01:27.195469 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9973d9b-7167-4a1b-9115-a7fb9a2921c3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.195479 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9973d9b-7167-4a1b-9115-a7fb9a2921c3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.195725 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9973d9b-7167-4a1b-9115-a7fb9a2921c3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.195770 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a320a32-4b25-423c-9e3c-5ca2d08652c5" containerName="keystone-cron" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.196529 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.199107 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.199289 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.199300 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.199602 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.212842 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz2kg"] Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.281422 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg7mc\" (UniqueName: \"kubernetes.io/projected/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-kube-api-access-wg7mc\") pod \"ssh-known-hosts-edpm-deployment-dz2kg\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.281478 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz2kg\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.281749 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz2kg\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.383645 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg7mc\" (UniqueName: \"kubernetes.io/projected/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-kube-api-access-wg7mc\") pod \"ssh-known-hosts-edpm-deployment-dz2kg\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.383724 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz2kg\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.383758 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz2kg\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.389377 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz2kg\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:27 crc kubenswrapper[4717]: 
I0308 06:01:27.389417 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz2kg\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.402963 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg7mc\" (UniqueName: \"kubernetes.io/projected/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-kube-api-access-wg7mc\") pod \"ssh-known-hosts-edpm-deployment-dz2kg\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:27 crc kubenswrapper[4717]: I0308 06:01:27.513586 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:28 crc kubenswrapper[4717]: I0308 06:01:28.092324 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz2kg"] Mar 08 06:01:28 crc kubenswrapper[4717]: I0308 06:01:28.113347 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" event={"ID":"d4a65517-456b-4bf4-9e4b-cea94baeb6a7","Type":"ContainerStarted","Data":"b0ffd6636cc707ca9424ef62cbe6d2d6bfedb50f7b825d7954a3b154348b78e9"} Mar 08 06:01:29 crc kubenswrapper[4717]: I0308 06:01:29.123980 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" event={"ID":"d4a65517-456b-4bf4-9e4b-cea94baeb6a7","Type":"ContainerStarted","Data":"fef633465525ff19549b9c3c44aa25f937ed55794ade4402ef915bf81e51a3b6"} Mar 08 06:01:29 crc kubenswrapper[4717]: I0308 06:01:29.156426 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" podStartSLOduration=1.584528545 
podStartE2EDuration="2.15640076s" podCreationTimestamp="2026-03-08 06:01:27 +0000 UTC" firstStartedPulling="2026-03-08 06:01:28.096970411 +0000 UTC m=+2115.014619265" lastFinishedPulling="2026-03-08 06:01:28.668842596 +0000 UTC m=+2115.586491480" observedRunningTime="2026-03-08 06:01:29.142042467 +0000 UTC m=+2116.059691321" watchObservedRunningTime="2026-03-08 06:01:29.15640076 +0000 UTC m=+2116.074049604" Mar 08 06:01:34 crc kubenswrapper[4717]: I0308 06:01:34.120239 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:01:34 crc kubenswrapper[4717]: I0308 06:01:34.120556 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:01:34 crc kubenswrapper[4717]: I0308 06:01:34.120602 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 06:01:34 crc kubenswrapper[4717]: I0308 06:01:34.121478 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"836fb6ea7382ba7653ac2743e70af5d3bd32623b547700d430e91afed9d0c9da"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 06:01:34 crc kubenswrapper[4717]: I0308 06:01:34.121546 4717 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://836fb6ea7382ba7653ac2743e70af5d3bd32623b547700d430e91afed9d0c9da" gracePeriod=600 Mar 08 06:01:35 crc kubenswrapper[4717]: I0308 06:01:35.214157 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="836fb6ea7382ba7653ac2743e70af5d3bd32623b547700d430e91afed9d0c9da" exitCode=0 Mar 08 06:01:35 crc kubenswrapper[4717]: I0308 06:01:35.214230 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"836fb6ea7382ba7653ac2743e70af5d3bd32623b547700d430e91afed9d0c9da"} Mar 08 06:01:35 crc kubenswrapper[4717]: I0308 06:01:35.214719 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e"} Mar 08 06:01:35 crc kubenswrapper[4717]: I0308 06:01:35.214748 4717 scope.go:117] "RemoveContainer" containerID="e5fd441ddebfc81f662ec3ceac1a36a5198bcf60f235b39862ccf75de969016b" Mar 08 06:01:35 crc kubenswrapper[4717]: I0308 06:01:35.841141 4717 scope.go:117] "RemoveContainer" containerID="4d607c4bdb313b66bd7a7fc1793f91f8ed731f74c7918eca1bf97c4e2e7278c7" Mar 08 06:01:36 crc kubenswrapper[4717]: I0308 06:01:36.237308 4717 generic.go:334] "Generic (PLEG): container finished" podID="d4a65517-456b-4bf4-9e4b-cea94baeb6a7" containerID="fef633465525ff19549b9c3c44aa25f937ed55794ade4402ef915bf81e51a3b6" exitCode=0 Mar 08 06:01:36 crc kubenswrapper[4717]: I0308 06:01:36.237377 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" 
event={"ID":"d4a65517-456b-4bf4-9e4b-cea94baeb6a7","Type":"ContainerDied","Data":"fef633465525ff19549b9c3c44aa25f937ed55794ade4402ef915bf81e51a3b6"} Mar 08 06:01:37 crc kubenswrapper[4717]: I0308 06:01:37.772266 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:37 crc kubenswrapper[4717]: I0308 06:01:37.918288 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg7mc\" (UniqueName: \"kubernetes.io/projected/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-kube-api-access-wg7mc\") pod \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " Mar 08 06:01:37 crc kubenswrapper[4717]: I0308 06:01:37.918647 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-inventory-0\") pod \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " Mar 08 06:01:37 crc kubenswrapper[4717]: I0308 06:01:37.918762 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-ssh-key-openstack-edpm-ipam\") pod \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\" (UID: \"d4a65517-456b-4bf4-9e4b-cea94baeb6a7\") " Mar 08 06:01:37 crc kubenswrapper[4717]: I0308 06:01:37.929978 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-kube-api-access-wg7mc" (OuterVolumeSpecName: "kube-api-access-wg7mc") pod "d4a65517-456b-4bf4-9e4b-cea94baeb6a7" (UID: "d4a65517-456b-4bf4-9e4b-cea94baeb6a7"). InnerVolumeSpecName "kube-api-access-wg7mc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:01:37 crc kubenswrapper[4717]: I0308 06:01:37.952916 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d4a65517-456b-4bf4-9e4b-cea94baeb6a7" (UID: "d4a65517-456b-4bf4-9e4b-cea94baeb6a7"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:01:37 crc kubenswrapper[4717]: I0308 06:01:37.954820 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d4a65517-456b-4bf4-9e4b-cea94baeb6a7" (UID: "d4a65517-456b-4bf4-9e4b-cea94baeb6a7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.020978 4717 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.021010 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.021021 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg7mc\" (UniqueName: \"kubernetes.io/projected/d4a65517-456b-4bf4-9e4b-cea94baeb6a7-kube-api-access-wg7mc\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.257387 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" 
event={"ID":"d4a65517-456b-4bf4-9e4b-cea94baeb6a7","Type":"ContainerDied","Data":"b0ffd6636cc707ca9424ef62cbe6d2d6bfedb50f7b825d7954a3b154348b78e9"} Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.257442 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0ffd6636cc707ca9424ef62cbe6d2d6bfedb50f7b825d7954a3b154348b78e9" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.257493 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz2kg" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.377588 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp"] Mar 08 06:01:38 crc kubenswrapper[4717]: E0308 06:01:38.378071 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a65517-456b-4bf4-9e4b-cea94baeb6a7" containerName="ssh-known-hosts-edpm-deployment" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.378090 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a65517-456b-4bf4-9e4b-cea94baeb6a7" containerName="ssh-known-hosts-edpm-deployment" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.378479 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a65517-456b-4bf4-9e4b-cea94baeb6a7" containerName="ssh-known-hosts-edpm-deployment" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.379293 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.382646 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.382665 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.383084 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.383244 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.403895 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp"] Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.532940 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkg8g\" (UniqueName: \"kubernetes.io/projected/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-kube-api-access-dkg8g\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hdxjp\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.533254 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hdxjp\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.533290 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hdxjp\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.635340 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hdxjp\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.635430 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hdxjp\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.635567 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkg8g\" (UniqueName: \"kubernetes.io/projected/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-kube-api-access-dkg8g\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hdxjp\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.640234 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hdxjp\" (UID: 
\"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.645720 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hdxjp\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.652770 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkg8g\" (UniqueName: \"kubernetes.io/projected/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-kube-api-access-dkg8g\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hdxjp\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:38 crc kubenswrapper[4717]: I0308 06:01:38.709269 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:39 crc kubenswrapper[4717]: I0308 06:01:39.300620 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp"] Mar 08 06:01:39 crc kubenswrapper[4717]: W0308 06:01:39.309391 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode60e7e38_c8ac_4b48_bfc0_04e5b8e56874.slice/crio-6e19c6d76af1e11549fc3df450bc54a540145f4578233f82d6bc085ff1672bab WatchSource:0}: Error finding container 6e19c6d76af1e11549fc3df450bc54a540145f4578233f82d6bc085ff1672bab: Status 404 returned error can't find the container with id 6e19c6d76af1e11549fc3df450bc54a540145f4578233f82d6bc085ff1672bab Mar 08 06:01:40 crc kubenswrapper[4717]: I0308 06:01:40.300452 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" event={"ID":"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874","Type":"ContainerStarted","Data":"e8bfa0e524315a43f4ac7677ea51247d5592e55e16a8454889dd818c2a239499"} Mar 08 06:01:40 crc kubenswrapper[4717]: I0308 06:01:40.300849 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" event={"ID":"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874","Type":"ContainerStarted","Data":"6e19c6d76af1e11549fc3df450bc54a540145f4578233f82d6bc085ff1672bab"} Mar 08 06:01:50 crc kubenswrapper[4717]: I0308 06:01:50.418246 4717 generic.go:334] "Generic (PLEG): container finished" podID="e60e7e38-c8ac-4b48-bfc0-04e5b8e56874" containerID="e8bfa0e524315a43f4ac7677ea51247d5592e55e16a8454889dd818c2a239499" exitCode=0 Mar 08 06:01:50 crc kubenswrapper[4717]: I0308 06:01:50.418359 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" 
event={"ID":"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874","Type":"ContainerDied","Data":"e8bfa0e524315a43f4ac7677ea51247d5592e55e16a8454889dd818c2a239499"} Mar 08 06:01:51 crc kubenswrapper[4717]: I0308 06:01:51.909110 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.063591 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-inventory\") pod \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.064203 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkg8g\" (UniqueName: \"kubernetes.io/projected/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-kube-api-access-dkg8g\") pod \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.064246 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-ssh-key-openstack-edpm-ipam\") pod \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\" (UID: \"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874\") " Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.070339 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-kube-api-access-dkg8g" (OuterVolumeSpecName: "kube-api-access-dkg8g") pod "e60e7e38-c8ac-4b48-bfc0-04e5b8e56874" (UID: "e60e7e38-c8ac-4b48-bfc0-04e5b8e56874"). InnerVolumeSpecName "kube-api-access-dkg8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.110527 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-inventory" (OuterVolumeSpecName: "inventory") pod "e60e7e38-c8ac-4b48-bfc0-04e5b8e56874" (UID: "e60e7e38-c8ac-4b48-bfc0-04e5b8e56874"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.126868 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e60e7e38-c8ac-4b48-bfc0-04e5b8e56874" (UID: "e60e7e38-c8ac-4b48-bfc0-04e5b8e56874"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.166749 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkg8g\" (UniqueName: \"kubernetes.io/projected/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-kube-api-access-dkg8g\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.166778 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.166788 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e60e7e38-c8ac-4b48-bfc0-04e5b8e56874-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.444591 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" 
event={"ID":"e60e7e38-c8ac-4b48-bfc0-04e5b8e56874","Type":"ContainerDied","Data":"6e19c6d76af1e11549fc3df450bc54a540145f4578233f82d6bc085ff1672bab"} Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.444640 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e19c6d76af1e11549fc3df450bc54a540145f4578233f82d6bc085ff1672bab" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.444679 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hdxjp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.513804 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp"] Mar 08 06:01:52 crc kubenswrapper[4717]: E0308 06:01:52.514353 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e60e7e38-c8ac-4b48-bfc0-04e5b8e56874" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.514372 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60e7e38-c8ac-4b48-bfc0-04e5b8e56874" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.514587 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e60e7e38-c8ac-4b48-bfc0-04e5b8e56874" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.515225 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.516979 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.517328 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.517758 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.520148 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.529362 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp"] Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.574207 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.574396 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfxpg\" (UniqueName: \"kubernetes.io/projected/228cb615-a265-435c-bca0-5cb037e311b6-kube-api-access-pfxpg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 
06:01:52.574462 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.676305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfxpg\" (UniqueName: \"kubernetes.io/projected/228cb615-a265-435c-bca0-5cb037e311b6-kube-api-access-pfxpg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.676369 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.676421 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.681984 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.683433 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.751223 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfxpg\" (UniqueName: \"kubernetes.io/projected/228cb615-a265-435c-bca0-5cb037e311b6-kube-api-access-pfxpg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:52 crc kubenswrapper[4717]: I0308 06:01:52.835142 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:01:53 crc kubenswrapper[4717]: I0308 06:01:53.524463 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp"] Mar 08 06:01:54 crc kubenswrapper[4717]: I0308 06:01:54.462551 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" event={"ID":"228cb615-a265-435c-bca0-5cb037e311b6","Type":"ContainerStarted","Data":"cbdd3320e56a6fd10145b6d8257649880ca1ef12564ee0c7478378073d02cf56"} Mar 08 06:01:55 crc kubenswrapper[4717]: I0308 06:01:55.475253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" event={"ID":"228cb615-a265-435c-bca0-5cb037e311b6","Type":"ContainerStarted","Data":"e819a48a58b21c677c4cd96e77d0bb874f91b51704bb0723e76d4240df9f8823"} Mar 08 06:01:55 crc kubenswrapper[4717]: I0308 06:01:55.505066 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" podStartSLOduration=2.837111968 podStartE2EDuration="3.505036892s" podCreationTimestamp="2026-03-08 06:01:52 +0000 UTC" firstStartedPulling="2026-03-08 06:01:53.532655912 +0000 UTC m=+2140.450304756" lastFinishedPulling="2026-03-08 06:01:54.200580826 +0000 UTC m=+2141.118229680" observedRunningTime="2026-03-08 06:01:55.491712905 +0000 UTC m=+2142.409361769" watchObservedRunningTime="2026-03-08 06:01:55.505036892 +0000 UTC m=+2142.422685776" Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.145707 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549162-lxg8r"] Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.148383 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549162-lxg8r" Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.150379 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.150396 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.150922 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.176278 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549162-lxg8r"] Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.275842 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4ggh\" (UniqueName: \"kubernetes.io/projected/42305074-feb7-40c9-91c1-e9a34b11a20b-kube-api-access-j4ggh\") pod \"auto-csr-approver-29549162-lxg8r\" (UID: \"42305074-feb7-40c9-91c1-e9a34b11a20b\") " pod="openshift-infra/auto-csr-approver-29549162-lxg8r" Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.377871 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ggh\" (UniqueName: \"kubernetes.io/projected/42305074-feb7-40c9-91c1-e9a34b11a20b-kube-api-access-j4ggh\") pod \"auto-csr-approver-29549162-lxg8r\" (UID: \"42305074-feb7-40c9-91c1-e9a34b11a20b\") " pod="openshift-infra/auto-csr-approver-29549162-lxg8r" Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.413396 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4ggh\" (UniqueName: \"kubernetes.io/projected/42305074-feb7-40c9-91c1-e9a34b11a20b-kube-api-access-j4ggh\") pod \"auto-csr-approver-29549162-lxg8r\" (UID: \"42305074-feb7-40c9-91c1-e9a34b11a20b\") " 
pod="openshift-infra/auto-csr-approver-29549162-lxg8r" Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.479668 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549162-lxg8r" Mar 08 06:02:00 crc kubenswrapper[4717]: I0308 06:02:00.940458 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549162-lxg8r"] Mar 08 06:02:01 crc kubenswrapper[4717]: I0308 06:02:01.528283 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549162-lxg8r" event={"ID":"42305074-feb7-40c9-91c1-e9a34b11a20b","Type":"ContainerStarted","Data":"9f59384b2d3a65011e3a01255e781a01765ccb97d13db769090e28f06e79a3fe"} Mar 08 06:02:02 crc kubenswrapper[4717]: I0308 06:02:02.540445 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549162-lxg8r" event={"ID":"42305074-feb7-40c9-91c1-e9a34b11a20b","Type":"ContainerStarted","Data":"4305ce0069088ea09abb95ef5fb0eeb989a57c756ec7545c137c3e3256942c5d"} Mar 08 06:02:02 crc kubenswrapper[4717]: I0308 06:02:02.568218 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549162-lxg8r" podStartSLOduration=1.493135729 podStartE2EDuration="2.568198181s" podCreationTimestamp="2026-03-08 06:02:00 +0000 UTC" firstStartedPulling="2026-03-08 06:02:00.953736361 +0000 UTC m=+2147.871385205" lastFinishedPulling="2026-03-08 06:02:02.028798773 +0000 UTC m=+2148.946447657" observedRunningTime="2026-03-08 06:02:02.559261761 +0000 UTC m=+2149.476910605" watchObservedRunningTime="2026-03-08 06:02:02.568198181 +0000 UTC m=+2149.485847045" Mar 08 06:02:03 crc kubenswrapper[4717]: I0308 06:02:03.552350 4717 generic.go:334] "Generic (PLEG): container finished" podID="42305074-feb7-40c9-91c1-e9a34b11a20b" containerID="4305ce0069088ea09abb95ef5fb0eeb989a57c756ec7545c137c3e3256942c5d" exitCode=0 Mar 08 06:02:03 crc 
kubenswrapper[4717]: I0308 06:02:03.552474 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549162-lxg8r" event={"ID":"42305074-feb7-40c9-91c1-e9a34b11a20b","Type":"ContainerDied","Data":"4305ce0069088ea09abb95ef5fb0eeb989a57c756ec7545c137c3e3256942c5d"} Mar 08 06:02:04 crc kubenswrapper[4717]: I0308 06:02:04.567137 4717 generic.go:334] "Generic (PLEG): container finished" podID="228cb615-a265-435c-bca0-5cb037e311b6" containerID="e819a48a58b21c677c4cd96e77d0bb874f91b51704bb0723e76d4240df9f8823" exitCode=0 Mar 08 06:02:04 crc kubenswrapper[4717]: I0308 06:02:04.567204 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" event={"ID":"228cb615-a265-435c-bca0-5cb037e311b6","Type":"ContainerDied","Data":"e819a48a58b21c677c4cd96e77d0bb874f91b51704bb0723e76d4240df9f8823"} Mar 08 06:02:04 crc kubenswrapper[4717]: I0308 06:02:04.967321 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549162-lxg8r" Mar 08 06:02:04 crc kubenswrapper[4717]: I0308 06:02:04.983952 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4ggh\" (UniqueName: \"kubernetes.io/projected/42305074-feb7-40c9-91c1-e9a34b11a20b-kube-api-access-j4ggh\") pod \"42305074-feb7-40c9-91c1-e9a34b11a20b\" (UID: \"42305074-feb7-40c9-91c1-e9a34b11a20b\") " Mar 08 06:02:04 crc kubenswrapper[4717]: I0308 06:02:04.990977 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42305074-feb7-40c9-91c1-e9a34b11a20b-kube-api-access-j4ggh" (OuterVolumeSpecName: "kube-api-access-j4ggh") pod "42305074-feb7-40c9-91c1-e9a34b11a20b" (UID: "42305074-feb7-40c9-91c1-e9a34b11a20b"). InnerVolumeSpecName "kube-api-access-j4ggh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:02:05 crc kubenswrapper[4717]: I0308 06:02:05.087140 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4ggh\" (UniqueName: \"kubernetes.io/projected/42305074-feb7-40c9-91c1-e9a34b11a20b-kube-api-access-j4ggh\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:05 crc kubenswrapper[4717]: I0308 06:02:05.581841 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549162-lxg8r" event={"ID":"42305074-feb7-40c9-91c1-e9a34b11a20b","Type":"ContainerDied","Data":"9f59384b2d3a65011e3a01255e781a01765ccb97d13db769090e28f06e79a3fe"} Mar 08 06:02:05 crc kubenswrapper[4717]: I0308 06:02:05.581898 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f59384b2d3a65011e3a01255e781a01765ccb97d13db769090e28f06e79a3fe" Mar 08 06:02:05 crc kubenswrapper[4717]: I0308 06:02:05.581924 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549162-lxg8r" Mar 08 06:02:05 crc kubenswrapper[4717]: I0308 06:02:05.663166 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549156-xd45c"] Mar 08 06:02:05 crc kubenswrapper[4717]: I0308 06:02:05.677431 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549156-xd45c"] Mar 08 06:02:05 crc kubenswrapper[4717]: I0308 06:02:05.793794 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a115a6-c45f-4247-baa4-97af8465ac5c" path="/var/lib/kubelet/pods/36a115a6-c45f-4247-baa4-97af8465ac5c/volumes" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.073894 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.108663 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfxpg\" (UniqueName: \"kubernetes.io/projected/228cb615-a265-435c-bca0-5cb037e311b6-kube-api-access-pfxpg\") pod \"228cb615-a265-435c-bca0-5cb037e311b6\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.108724 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-ssh-key-openstack-edpm-ipam\") pod \"228cb615-a265-435c-bca0-5cb037e311b6\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.109000 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-inventory\") pod \"228cb615-a265-435c-bca0-5cb037e311b6\" (UID: \"228cb615-a265-435c-bca0-5cb037e311b6\") " Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.125857 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228cb615-a265-435c-bca0-5cb037e311b6-kube-api-access-pfxpg" (OuterVolumeSpecName: "kube-api-access-pfxpg") pod "228cb615-a265-435c-bca0-5cb037e311b6" (UID: "228cb615-a265-435c-bca0-5cb037e311b6"). InnerVolumeSpecName "kube-api-access-pfxpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.163926 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-inventory" (OuterVolumeSpecName: "inventory") pod "228cb615-a265-435c-bca0-5cb037e311b6" (UID: "228cb615-a265-435c-bca0-5cb037e311b6"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.184845 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "228cb615-a265-435c-bca0-5cb037e311b6" (UID: "228cb615-a265-435c-bca0-5cb037e311b6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.212006 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.212046 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfxpg\" (UniqueName: \"kubernetes.io/projected/228cb615-a265-435c-bca0-5cb037e311b6-kube-api-access-pfxpg\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.212061 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/228cb615-a265-435c-bca0-5cb037e311b6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.593761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" event={"ID":"228cb615-a265-435c-bca0-5cb037e311b6","Type":"ContainerDied","Data":"cbdd3320e56a6fd10145b6d8257649880ca1ef12564ee0c7478378073d02cf56"} Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.593809 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbdd3320e56a6fd10145b6d8257649880ca1ef12564ee0c7478378073d02cf56" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 
06:02:06.593905 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.689233 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw"] Mar 08 06:02:06 crc kubenswrapper[4717]: E0308 06:02:06.689706 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228cb615-a265-435c-bca0-5cb037e311b6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.689726 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="228cb615-a265-435c-bca0-5cb037e311b6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:02:06 crc kubenswrapper[4717]: E0308 06:02:06.689740 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42305074-feb7-40c9-91c1-e9a34b11a20b" containerName="oc" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.689748 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="42305074-feb7-40c9-91c1-e9a34b11a20b" containerName="oc" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.690012 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="228cb615-a265-435c-bca0-5cb037e311b6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.690031 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="42305074-feb7-40c9-91c1-e9a34b11a20b" containerName="oc" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.690820 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.700010 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw"] Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.732181 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.732177 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.732430 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.732509 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.732649 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.732751 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.732834 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.732932 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.833504 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.833563 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.833585 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.833982 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.834076 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.834164 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.834211 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.834299 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.834333 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.834403 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.834524 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.834590 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.834710 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.834749 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k2w2\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-kube-api-access-7k2w2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.936751 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937101 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937131 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937216 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937249 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937292 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937322 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937369 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937396 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937434 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937463 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937500 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937554 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.937583 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k2w2\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-kube-api-access-7k2w2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.941976 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.942492 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.942613 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.945094 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.945359 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.945446 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.946316 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.948002 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.948051 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.948794 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.948826 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.952070 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.952278 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:06 crc kubenswrapper[4717]: I0308 06:02:06.969276 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k2w2\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-kube-api-access-7k2w2\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.046719 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.288974 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j6qt5"] Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.294597 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.312821 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6qt5"] Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.448121 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whd4k\" (UniqueName: \"kubernetes.io/projected/08d8a178-2929-4523-a506-b9e9417a2fae-kube-api-access-whd4k\") pod \"redhat-operators-j6qt5\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.448211 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-catalog-content\") pod \"redhat-operators-j6qt5\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.448498 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-utilities\") pod \"redhat-operators-j6qt5\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.550937 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whd4k\" (UniqueName: \"kubernetes.io/projected/08d8a178-2929-4523-a506-b9e9417a2fae-kube-api-access-whd4k\") pod \"redhat-operators-j6qt5\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.551004 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-catalog-content\") pod \"redhat-operators-j6qt5\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.551102 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-utilities\") pod \"redhat-operators-j6qt5\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.551630 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-catalog-content\") pod \"redhat-operators-j6qt5\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.551651 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-utilities\") pod \"redhat-operators-j6qt5\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:07 crc kubenswrapper[4717]: I0308 06:02:07.573414 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whd4k\" (UniqueName: \"kubernetes.io/projected/08d8a178-2929-4523-a506-b9e9417a2fae-kube-api-access-whd4k\") pod \"redhat-operators-j6qt5\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:08 crc kubenswrapper[4717]: I0308 06:02:07.616354 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:08 crc kubenswrapper[4717]: I0308 06:02:07.684073 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw"] Mar 08 06:02:08 crc kubenswrapper[4717]: W0308 06:02:07.705151 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod534637bd_8579_46f3_bee7_d6270aa8130c.slice/crio-d8923ca980a904cff658ef568b3052c5ff5c644eb5ba8844fce9210a7796b10d WatchSource:0}: Error finding container d8923ca980a904cff658ef568b3052c5ff5c644eb5ba8844fce9210a7796b10d: Status 404 returned error can't find the container with id d8923ca980a904cff658ef568b3052c5ff5c644eb5ba8844fce9210a7796b10d Mar 08 06:02:08 crc kubenswrapper[4717]: I0308 06:02:08.610444 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" event={"ID":"534637bd-8579-46f3-bee7-d6270aa8130c","Type":"ContainerStarted","Data":"9d1d729bb3eb2a531aac971fd442e8cddb8e934ced467d054736c762164a7c93"} Mar 08 06:02:08 crc kubenswrapper[4717]: I0308 06:02:08.610495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" event={"ID":"534637bd-8579-46f3-bee7-d6270aa8130c","Type":"ContainerStarted","Data":"d8923ca980a904cff658ef568b3052c5ff5c644eb5ba8844fce9210a7796b10d"} Mar 08 06:02:08 crc kubenswrapper[4717]: I0308 06:02:08.645426 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6qt5"] Mar 08 06:02:08 crc kubenswrapper[4717]: I0308 06:02:08.645826 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" podStartSLOduration=2.212139424 podStartE2EDuration="2.645814005s" podCreationTimestamp="2026-03-08 06:02:06 +0000 UTC" firstStartedPulling="2026-03-08 06:02:07.709643183 +0000 UTC m=+2154.627292027" lastFinishedPulling="2026-03-08 06:02:08.143317764 +0000 UTC m=+2155.060966608" observedRunningTime="2026-03-08 06:02:08.643798976 +0000 UTC m=+2155.561447820" watchObservedRunningTime="2026-03-08 06:02:08.645814005 +0000 UTC m=+2155.563462849" Mar 08 06:02:09 crc kubenswrapper[4717]: I0308 06:02:09.630403 4717 generic.go:334] "Generic (PLEG): container finished" podID="08d8a178-2929-4523-a506-b9e9417a2fae" containerID="ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce" exitCode=0 Mar 08 06:02:09 crc kubenswrapper[4717]: I0308 06:02:09.630562 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6qt5" event={"ID":"08d8a178-2929-4523-a506-b9e9417a2fae","Type":"ContainerDied","Data":"ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce"} Mar 08 06:02:09 crc kubenswrapper[4717]: I0308 06:02:09.631204 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6qt5" event={"ID":"08d8a178-2929-4523-a506-b9e9417a2fae","Type":"ContainerStarted","Data":"893a8c8f6d5d7c31e00c94586ac216372393530c682883066b9fe512b20fc93b"} Mar 08 06:02:11 crc kubenswrapper[4717]: I0308 
06:02:11.657335 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6qt5" event={"ID":"08d8a178-2929-4523-a506-b9e9417a2fae","Type":"ContainerStarted","Data":"f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8"} Mar 08 06:02:13 crc kubenswrapper[4717]: I0308 06:02:13.681396 4717 generic.go:334] "Generic (PLEG): container finished" podID="08d8a178-2929-4523-a506-b9e9417a2fae" containerID="f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8" exitCode=0 Mar 08 06:02:13 crc kubenswrapper[4717]: I0308 06:02:13.681475 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6qt5" event={"ID":"08d8a178-2929-4523-a506-b9e9417a2fae","Type":"ContainerDied","Data":"f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8"} Mar 08 06:02:14 crc kubenswrapper[4717]: I0308 06:02:14.697225 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6qt5" event={"ID":"08d8a178-2929-4523-a506-b9e9417a2fae","Type":"ContainerStarted","Data":"1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580"} Mar 08 06:02:14 crc kubenswrapper[4717]: I0308 06:02:14.724852 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j6qt5" podStartSLOduration=3.207076379 podStartE2EDuration="7.724824253s" podCreationTimestamp="2026-03-08 06:02:07 +0000 UTC" firstStartedPulling="2026-03-08 06:02:09.633771629 +0000 UTC m=+2156.551420483" lastFinishedPulling="2026-03-08 06:02:14.151519493 +0000 UTC m=+2161.069168357" observedRunningTime="2026-03-08 06:02:14.715043923 +0000 UTC m=+2161.632692777" watchObservedRunningTime="2026-03-08 06:02:14.724824253 +0000 UTC m=+2161.642473167" Mar 08 06:02:17 crc kubenswrapper[4717]: I0308 06:02:17.616638 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 
06:02:17 crc kubenswrapper[4717]: I0308 06:02:17.616934 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:18 crc kubenswrapper[4717]: I0308 06:02:18.671687 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j6qt5" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" containerName="registry-server" probeResult="failure" output=< Mar 08 06:02:18 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 06:02:18 crc kubenswrapper[4717]: > Mar 08 06:02:28 crc kubenswrapper[4717]: I0308 06:02:28.669396 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j6qt5" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" containerName="registry-server" probeResult="failure" output=< Mar 08 06:02:28 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 06:02:28 crc kubenswrapper[4717]: > Mar 08 06:02:35 crc kubenswrapper[4717]: I0308 06:02:35.949432 4717 scope.go:117] "RemoveContainer" containerID="c3461660eaef6e37145da764f83cacc73e0f6235dba2451c2fd8bdb99819e97f" Mar 08 06:02:37 crc kubenswrapper[4717]: I0308 06:02:37.665382 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:37 crc kubenswrapper[4717]: I0308 06:02:37.730287 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:38 crc kubenswrapper[4717]: I0308 06:02:38.479730 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6qt5"] Mar 08 06:02:38 crc kubenswrapper[4717]: I0308 06:02:38.959463 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j6qt5" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" containerName="registry-server" 
containerID="cri-o://1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580" gracePeriod=2 Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.496521 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.632728 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whd4k\" (UniqueName: \"kubernetes.io/projected/08d8a178-2929-4523-a506-b9e9417a2fae-kube-api-access-whd4k\") pod \"08d8a178-2929-4523-a506-b9e9417a2fae\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.632790 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-catalog-content\") pod \"08d8a178-2929-4523-a506-b9e9417a2fae\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.633264 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-utilities\") pod \"08d8a178-2929-4523-a506-b9e9417a2fae\" (UID: \"08d8a178-2929-4523-a506-b9e9417a2fae\") " Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.635080 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-utilities" (OuterVolumeSpecName: "utilities") pod "08d8a178-2929-4523-a506-b9e9417a2fae" (UID: "08d8a178-2929-4523-a506-b9e9417a2fae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.643402 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d8a178-2929-4523-a506-b9e9417a2fae-kube-api-access-whd4k" (OuterVolumeSpecName: "kube-api-access-whd4k") pod "08d8a178-2929-4523-a506-b9e9417a2fae" (UID: "08d8a178-2929-4523-a506-b9e9417a2fae"). InnerVolumeSpecName "kube-api-access-whd4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.736158 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whd4k\" (UniqueName: \"kubernetes.io/projected/08d8a178-2929-4523-a506-b9e9417a2fae-kube-api-access-whd4k\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.736510 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.811357 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08d8a178-2929-4523-a506-b9e9417a2fae" (UID: "08d8a178-2929-4523-a506-b9e9417a2fae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.839419 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d8a178-2929-4523-a506-b9e9417a2fae-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.979888 4717 generic.go:334] "Generic (PLEG): container finished" podID="08d8a178-2929-4523-a506-b9e9417a2fae" containerID="1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580" exitCode=0 Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.979940 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6qt5" event={"ID":"08d8a178-2929-4523-a506-b9e9417a2fae","Type":"ContainerDied","Data":"1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580"} Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.979971 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6qt5" event={"ID":"08d8a178-2929-4523-a506-b9e9417a2fae","Type":"ContainerDied","Data":"893a8c8f6d5d7c31e00c94586ac216372393530c682883066b9fe512b20fc93b"} Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.979995 4717 scope.go:117] "RemoveContainer" containerID="1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580" Mar 08 06:02:39 crc kubenswrapper[4717]: I0308 06:02:39.980009 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j6qt5" Mar 08 06:02:40 crc kubenswrapper[4717]: I0308 06:02:40.023153 4717 scope.go:117] "RemoveContainer" containerID="f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8" Mar 08 06:02:40 crc kubenswrapper[4717]: I0308 06:02:40.070842 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6qt5"] Mar 08 06:02:40 crc kubenswrapper[4717]: I0308 06:02:40.080321 4717 scope.go:117] "RemoveContainer" containerID="ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce" Mar 08 06:02:40 crc kubenswrapper[4717]: I0308 06:02:40.081097 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j6qt5"] Mar 08 06:02:40 crc kubenswrapper[4717]: I0308 06:02:40.119105 4717 scope.go:117] "RemoveContainer" containerID="1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580" Mar 08 06:02:40 crc kubenswrapper[4717]: E0308 06:02:40.120073 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580\": container with ID starting with 1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580 not found: ID does not exist" containerID="1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580" Mar 08 06:02:40 crc kubenswrapper[4717]: I0308 06:02:40.120127 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580"} err="failed to get container status \"1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580\": rpc error: code = NotFound desc = could not find container \"1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580\": container with ID starting with 1eff2ed7fd782b9c19044ddeb0f42562e2c37410f0a4bfa57570e7c636b9e580 not found: ID does 
not exist" Mar 08 06:02:40 crc kubenswrapper[4717]: I0308 06:02:40.120160 4717 scope.go:117] "RemoveContainer" containerID="f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8" Mar 08 06:02:40 crc kubenswrapper[4717]: E0308 06:02:40.120639 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8\": container with ID starting with f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8 not found: ID does not exist" containerID="f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8" Mar 08 06:02:40 crc kubenswrapper[4717]: I0308 06:02:40.120675 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8"} err="failed to get container status \"f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8\": rpc error: code = NotFound desc = could not find container \"f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8\": container with ID starting with f3df26849cf074581c8ec6e193a9d34cc14aab703bf7a99e7b995861897ca9c8 not found: ID does not exist" Mar 08 06:02:40 crc kubenswrapper[4717]: I0308 06:02:40.120743 4717 scope.go:117] "RemoveContainer" containerID="ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce" Mar 08 06:02:40 crc kubenswrapper[4717]: E0308 06:02:40.121215 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce\": container with ID starting with ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce not found: ID does not exist" containerID="ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce" Mar 08 06:02:40 crc kubenswrapper[4717]: I0308 06:02:40.121277 4717 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce"} err="failed to get container status \"ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce\": rpc error: code = NotFound desc = could not find container \"ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce\": container with ID starting with ecbda3889d3e9463b3c507b0e9092eb60959f519b0187a643e8b6541f83bc1ce not found: ID does not exist" Mar 08 06:02:41 crc kubenswrapper[4717]: I0308 06:02:41.802504 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" path="/var/lib/kubelet/pods/08d8a178-2929-4523-a506-b9e9417a2fae/volumes" Mar 08 06:02:47 crc kubenswrapper[4717]: I0308 06:02:47.086540 4717 generic.go:334] "Generic (PLEG): container finished" podID="534637bd-8579-46f3-bee7-d6270aa8130c" containerID="9d1d729bb3eb2a531aac971fd442e8cddb8e934ced467d054736c762164a7c93" exitCode=0 Mar 08 06:02:47 crc kubenswrapper[4717]: I0308 06:02:47.086640 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" event={"ID":"534637bd-8579-46f3-bee7-d6270aa8130c","Type":"ContainerDied","Data":"9d1d729bb3eb2a531aac971fd442e8cddb8e934ced467d054736c762164a7c93"} Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.630540 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.798614 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-inventory\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.799011 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.799067 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.799151 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k2w2\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-kube-api-access-7k2w2\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.799209 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-libvirt-combined-ca-bundle\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: 
\"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.799244 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ssh-key-openstack-edpm-ipam\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.799965 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-bootstrap-combined-ca-bundle\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.799992 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-repo-setup-combined-ca-bundle\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.800062 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-telemetry-combined-ca-bundle\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.800112 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 
06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.800156 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.800203 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-neutron-metadata-combined-ca-bundle\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.800284 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-nova-combined-ca-bundle\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.800349 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ovn-combined-ca-bundle\") pod \"534637bd-8579-46f3-bee7-d6270aa8130c\" (UID: \"534637bd-8579-46f3-bee7-d6270aa8130c\") " Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.805870 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.806864 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.807062 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-kube-api-access-7k2w2" (OuterVolumeSpecName: "kube-api-access-7k2w2") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "kube-api-access-7k2w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.807639 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.807893 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.808422 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.808560 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.809444 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.810021 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.813605 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.823711 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.823878 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.835679 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.848078 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-inventory" (OuterVolumeSpecName: "inventory") pod "534637bd-8579-46f3-bee7-d6270aa8130c" (UID: "534637bd-8579-46f3-bee7-d6270aa8130c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903745 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903847 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903861 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903871 4717 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903880 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903891 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903922 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903934 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903944 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k2w2\" (UniqueName: \"kubernetes.io/projected/534637bd-8579-46f3-bee7-d6270aa8130c-kube-api-access-7k2w2\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903954 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903962 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903971 4717 reconciler_common.go:293] "Volume 
detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903979 4717 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:48 crc kubenswrapper[4717]: I0308 06:02:48.903990 4717 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534637bd-8579-46f3-bee7-d6270aa8130c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.052284 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxflk"] Mar 08 06:02:49 crc kubenswrapper[4717]: E0308 06:02:49.052734 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" containerName="registry-server" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.052755 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" containerName="registry-server" Mar 08 06:02:49 crc kubenswrapper[4717]: E0308 06:02:49.052786 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" containerName="extract-content" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.052795 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" containerName="extract-content" Mar 08 06:02:49 crc kubenswrapper[4717]: E0308 06:02:49.052828 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" containerName="extract-utilities" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 
06:02:49.052837 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" containerName="extract-utilities" Mar 08 06:02:49 crc kubenswrapper[4717]: E0308 06:02:49.052858 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534637bd-8579-46f3-bee7-d6270aa8130c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.052867 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="534637bd-8579-46f3-bee7-d6270aa8130c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.053060 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d8a178-2929-4523-a506-b9e9417a2fae" containerName="registry-server" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.053089 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="534637bd-8579-46f3-bee7-d6270aa8130c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.054729 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.073964 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxflk"] Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.117150 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" event={"ID":"534637bd-8579-46f3-bee7-d6270aa8130c","Type":"ContainerDied","Data":"d8923ca980a904cff658ef568b3052c5ff5c644eb5ba8844fce9210a7796b10d"} Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.117195 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8923ca980a904cff658ef568b3052c5ff5c644eb5ba8844fce9210a7796b10d" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.117228 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.209208 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-utilities\") pod \"community-operators-mxflk\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.209396 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-catalog-content\") pod \"community-operators-mxflk\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.209431 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-6jfxc\" (UniqueName: \"kubernetes.io/projected/11fdc744-ecce-403a-9aaf-fc1767334f98-kube-api-access-6jfxc\") pod \"community-operators-mxflk\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.248069 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z"] Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.249712 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.256387 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.257890 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.257956 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.257911 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.258297 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.266222 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z"] Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.312426 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-utilities\") pod 
\"community-operators-mxflk\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.312835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-catalog-content\") pod \"community-operators-mxflk\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.312897 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jfxc\" (UniqueName: \"kubernetes.io/projected/11fdc744-ecce-403a-9aaf-fc1767334f98-kube-api-access-6jfxc\") pod \"community-operators-mxflk\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.313181 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-utilities\") pod \"community-operators-mxflk\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.313303 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-catalog-content\") pod \"community-operators-mxflk\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.331364 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jfxc\" (UniqueName: \"kubernetes.io/projected/11fdc744-ecce-403a-9aaf-fc1767334f98-kube-api-access-6jfxc\") pod 
\"community-operators-mxflk\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.375920 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.415392 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.415763 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.416053 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.416125 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.416163 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn56x\" (UniqueName: \"kubernetes.io/projected/fed1792b-78eb-43bf-9e33-276a5b4477f7-kube-api-access-sn56x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.518187 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.518243 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.518345 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.518377 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.518402 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn56x\" (UniqueName: \"kubernetes.io/projected/fed1792b-78eb-43bf-9e33-276a5b4477f7-kube-api-access-sn56x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.523524 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.523716 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.526140 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: 
\"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.535027 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.537356 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn56x\" (UniqueName: \"kubernetes.io/projected/fed1792b-78eb-43bf-9e33-276a5b4477f7-kube-api-access-sn56x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2c94z\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.573097 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:02:49 crc kubenswrapper[4717]: I0308 06:02:49.916305 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxflk"] Mar 08 06:02:50 crc kubenswrapper[4717]: I0308 06:02:50.129989 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxflk" event={"ID":"11fdc744-ecce-403a-9aaf-fc1767334f98","Type":"ContainerStarted","Data":"0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2"} Mar 08 06:02:50 crc kubenswrapper[4717]: I0308 06:02:50.130354 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxflk" event={"ID":"11fdc744-ecce-403a-9aaf-fc1767334f98","Type":"ContainerStarted","Data":"7728365f9d4a2de675f313f798c2d61479ded0bd776eb9b21ae05b20028053ca"} Mar 08 06:02:50 crc kubenswrapper[4717]: I0308 06:02:50.199091 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z"] Mar 08 06:02:50 crc kubenswrapper[4717]: W0308 06:02:50.207165 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfed1792b_78eb_43bf_9e33_276a5b4477f7.slice/crio-0d20634cafa5921585ca091f4975ddeb88e908521e15c3e301763b11b9e4af8f WatchSource:0}: Error finding container 0d20634cafa5921585ca091f4975ddeb88e908521e15c3e301763b11b9e4af8f: Status 404 returned error can't find the container with id 0d20634cafa5921585ca091f4975ddeb88e908521e15c3e301763b11b9e4af8f Mar 08 06:02:51 crc kubenswrapper[4717]: I0308 06:02:51.159650 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" event={"ID":"fed1792b-78eb-43bf-9e33-276a5b4477f7","Type":"ContainerStarted","Data":"6309897d4f6db6f9a8c0e14aa564e5bc8f69d9891ae36b10db3a6046da21decf"} Mar 08 06:02:51 crc 
kubenswrapper[4717]: I0308 06:02:51.160014 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" event={"ID":"fed1792b-78eb-43bf-9e33-276a5b4477f7","Type":"ContainerStarted","Data":"0d20634cafa5921585ca091f4975ddeb88e908521e15c3e301763b11b9e4af8f"} Mar 08 06:02:51 crc kubenswrapper[4717]: I0308 06:02:51.167114 4717 generic.go:334] "Generic (PLEG): container finished" podID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerID="0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2" exitCode=0 Mar 08 06:02:51 crc kubenswrapper[4717]: I0308 06:02:51.167179 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxflk" event={"ID":"11fdc744-ecce-403a-9aaf-fc1767334f98","Type":"ContainerDied","Data":"0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2"} Mar 08 06:02:51 crc kubenswrapper[4717]: I0308 06:02:51.189500 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" podStartSLOduration=1.666737294 podStartE2EDuration="2.189482262s" podCreationTimestamp="2026-03-08 06:02:49 +0000 UTC" firstStartedPulling="2026-03-08 06:02:50.21016611 +0000 UTC m=+2197.127814994" lastFinishedPulling="2026-03-08 06:02:50.732911118 +0000 UTC m=+2197.650559962" observedRunningTime="2026-03-08 06:02:51.177770674 +0000 UTC m=+2198.095419518" watchObservedRunningTime="2026-03-08 06:02:51.189482262 +0000 UTC m=+2198.107131106" Mar 08 06:02:52 crc kubenswrapper[4717]: I0308 06:02:52.180445 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxflk" event={"ID":"11fdc744-ecce-403a-9aaf-fc1767334f98","Type":"ContainerStarted","Data":"b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3"} Mar 08 06:02:54 crc kubenswrapper[4717]: I0308 06:02:54.200976 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerID="b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3" exitCode=0 Mar 08 06:02:54 crc kubenswrapper[4717]: I0308 06:02:54.201050 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxflk" event={"ID":"11fdc744-ecce-403a-9aaf-fc1767334f98","Type":"ContainerDied","Data":"b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3"} Mar 08 06:02:55 crc kubenswrapper[4717]: I0308 06:02:55.217621 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxflk" event={"ID":"11fdc744-ecce-403a-9aaf-fc1767334f98","Type":"ContainerStarted","Data":"1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161"} Mar 08 06:02:55 crc kubenswrapper[4717]: I0308 06:02:55.252405 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxflk" podStartSLOduration=2.817643678 podStartE2EDuration="6.252379234s" podCreationTimestamp="2026-03-08 06:02:49 +0000 UTC" firstStartedPulling="2026-03-08 06:02:51.170522996 +0000 UTC m=+2198.088171840" lastFinishedPulling="2026-03-08 06:02:54.605258542 +0000 UTC m=+2201.522907396" observedRunningTime="2026-03-08 06:02:55.247328699 +0000 UTC m=+2202.164977573" watchObservedRunningTime="2026-03-08 06:02:55.252379234 +0000 UTC m=+2202.170028118" Mar 08 06:02:59 crc kubenswrapper[4717]: I0308 06:02:59.376713 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:59 crc kubenswrapper[4717]: I0308 06:02:59.377661 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:02:59 crc kubenswrapper[4717]: I0308 06:02:59.458457 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxflk" Mar 08 
06:03:00 crc kubenswrapper[4717]: I0308 06:03:00.360736 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:03:00 crc kubenswrapper[4717]: I0308 06:03:00.435529 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxflk"] Mar 08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.294040 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxflk" podUID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerName="registry-server" containerID="cri-o://1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161" gracePeriod=2 Mar 08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.834865 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.857929 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-catalog-content\") pod \"11fdc744-ecce-403a-9aaf-fc1767334f98\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " Mar 08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.858128 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jfxc\" (UniqueName: \"kubernetes.io/projected/11fdc744-ecce-403a-9aaf-fc1767334f98-kube-api-access-6jfxc\") pod \"11fdc744-ecce-403a-9aaf-fc1767334f98\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " Mar 08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.858431 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-utilities\") pod \"11fdc744-ecce-403a-9aaf-fc1767334f98\" (UID: \"11fdc744-ecce-403a-9aaf-fc1767334f98\") " Mar 
08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.859339 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-utilities" (OuterVolumeSpecName: "utilities") pod "11fdc744-ecce-403a-9aaf-fc1767334f98" (UID: "11fdc744-ecce-403a-9aaf-fc1767334f98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.860005 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.863585 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fdc744-ecce-403a-9aaf-fc1767334f98-kube-api-access-6jfxc" (OuterVolumeSpecName: "kube-api-access-6jfxc") pod "11fdc744-ecce-403a-9aaf-fc1767334f98" (UID: "11fdc744-ecce-403a-9aaf-fc1767334f98"). InnerVolumeSpecName "kube-api-access-6jfxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.927244 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11fdc744-ecce-403a-9aaf-fc1767334f98" (UID: "11fdc744-ecce-403a-9aaf-fc1767334f98"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.961543 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11fdc744-ecce-403a-9aaf-fc1767334f98-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:03:02 crc kubenswrapper[4717]: I0308 06:03:02.961580 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jfxc\" (UniqueName: \"kubernetes.io/projected/11fdc744-ecce-403a-9aaf-fc1767334f98-kube-api-access-6jfxc\") on node \"crc\" DevicePath \"\"" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.359230 4717 generic.go:334] "Generic (PLEG): container finished" podID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerID="1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161" exitCode=0 Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.359299 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxflk" event={"ID":"11fdc744-ecce-403a-9aaf-fc1767334f98","Type":"ContainerDied","Data":"1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161"} Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.359854 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxflk" event={"ID":"11fdc744-ecce-403a-9aaf-fc1767334f98","Type":"ContainerDied","Data":"7728365f9d4a2de675f313f798c2d61479ded0bd776eb9b21ae05b20028053ca"} Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.359912 4717 scope.go:117] "RemoveContainer" containerID="1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.359368 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxflk" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.414500 4717 scope.go:117] "RemoveContainer" containerID="b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.446134 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxflk"] Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.459451 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxflk"] Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.473442 4717 scope.go:117] "RemoveContainer" containerID="0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.502167 4717 scope.go:117] "RemoveContainer" containerID="1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161" Mar 08 06:03:03 crc kubenswrapper[4717]: E0308 06:03:03.502636 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161\": container with ID starting with 1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161 not found: ID does not exist" containerID="1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.502696 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161"} err="failed to get container status \"1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161\": rpc error: code = NotFound desc = could not find container \"1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161\": container with ID starting with 1d92e8be19fd2bd81ad9a2a32ec584f93408505260f1079e990c14b4c88c4161 not 
found: ID does not exist" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.502723 4717 scope.go:117] "RemoveContainer" containerID="b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3" Mar 08 06:03:03 crc kubenswrapper[4717]: E0308 06:03:03.503210 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3\": container with ID starting with b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3 not found: ID does not exist" containerID="b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.503231 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3"} err="failed to get container status \"b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3\": rpc error: code = NotFound desc = could not find container \"b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3\": container with ID starting with b44b99c061e5cc264bee2901d9d953b0022c3e2436bb21c76cc6e346cf5784f3 not found: ID does not exist" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.503243 4717 scope.go:117] "RemoveContainer" containerID="0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2" Mar 08 06:03:03 crc kubenswrapper[4717]: E0308 06:03:03.503796 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2\": container with ID starting with 0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2 not found: ID does not exist" containerID="0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.503872 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2"} err="failed to get container status \"0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2\": rpc error: code = NotFound desc = could not find container \"0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2\": container with ID starting with 0fe532c5f26e3b6997657deb4abf1e59b038337f8708cdf43d406f72eeb041f2 not found: ID does not exist" Mar 08 06:03:03 crc kubenswrapper[4717]: I0308 06:03:03.798056 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11fdc744-ecce-403a-9aaf-fc1767334f98" path="/var/lib/kubelet/pods/11fdc744-ecce-403a-9aaf-fc1767334f98/volumes" Mar 08 06:03:34 crc kubenswrapper[4717]: I0308 06:03:34.119725 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:03:34 crc kubenswrapper[4717]: I0308 06:03:34.120454 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.155202 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549164-b96mc"] Mar 08 06:04:00 crc kubenswrapper[4717]: E0308 06:04:00.156017 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerName="extract-content" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.156028 4717 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerName="extract-content" Mar 08 06:04:00 crc kubenswrapper[4717]: E0308 06:04:00.156042 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerName="registry-server" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.156048 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerName="registry-server" Mar 08 06:04:00 crc kubenswrapper[4717]: E0308 06:04:00.156076 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerName="extract-utilities" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.156082 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerName="extract-utilities" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.156332 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="11fdc744-ecce-403a-9aaf-fc1767334f98" containerName="registry-server" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.157021 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549164-b96mc" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.158953 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.159528 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.159773 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.178693 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549164-b96mc"] Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.259840 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfr55\" (UniqueName: \"kubernetes.io/projected/536d60ba-03ce-49ad-a616-c6919c867a5d-kube-api-access-pfr55\") pod \"auto-csr-approver-29549164-b96mc\" (UID: \"536d60ba-03ce-49ad-a616-c6919c867a5d\") " pod="openshift-infra/auto-csr-approver-29549164-b96mc" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.361791 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfr55\" (UniqueName: \"kubernetes.io/projected/536d60ba-03ce-49ad-a616-c6919c867a5d-kube-api-access-pfr55\") pod \"auto-csr-approver-29549164-b96mc\" (UID: \"536d60ba-03ce-49ad-a616-c6919c867a5d\") " pod="openshift-infra/auto-csr-approver-29549164-b96mc" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.382379 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfr55\" (UniqueName: \"kubernetes.io/projected/536d60ba-03ce-49ad-a616-c6919c867a5d-kube-api-access-pfr55\") pod \"auto-csr-approver-29549164-b96mc\" (UID: \"536d60ba-03ce-49ad-a616-c6919c867a5d\") " 
pod="openshift-infra/auto-csr-approver-29549164-b96mc" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.473509 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549164-b96mc" Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.956843 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549164-b96mc"] Mar 08 06:04:00 crc kubenswrapper[4717]: I0308 06:04:00.965366 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 06:04:01 crc kubenswrapper[4717]: I0308 06:04:01.039995 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549164-b96mc" event={"ID":"536d60ba-03ce-49ad-a616-c6919c867a5d","Type":"ContainerStarted","Data":"2609c579d76441ba0f6c362b2173a38b631f8a9721a9dbeff699553ee70c6144"} Mar 08 06:04:03 crc kubenswrapper[4717]: I0308 06:04:03.078153 4717 generic.go:334] "Generic (PLEG): container finished" podID="fed1792b-78eb-43bf-9e33-276a5b4477f7" containerID="6309897d4f6db6f9a8c0e14aa564e5bc8f69d9891ae36b10db3a6046da21decf" exitCode=0 Mar 08 06:04:03 crc kubenswrapper[4717]: I0308 06:04:03.078229 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" event={"ID":"fed1792b-78eb-43bf-9e33-276a5b4477f7","Type":"ContainerDied","Data":"6309897d4f6db6f9a8c0e14aa564e5bc8f69d9891ae36b10db3a6046da21decf"} Mar 08 06:04:03 crc kubenswrapper[4717]: I0308 06:04:03.084186 4717 generic.go:334] "Generic (PLEG): container finished" podID="536d60ba-03ce-49ad-a616-c6919c867a5d" containerID="e17fcd41e47af32f07c85689c8ec79d146b04646138dcd2a8d9796a491c167d3" exitCode=0 Mar 08 06:04:03 crc kubenswrapper[4717]: I0308 06:04:03.084229 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549164-b96mc" 
event={"ID":"536d60ba-03ce-49ad-a616-c6919c867a5d","Type":"ContainerDied","Data":"e17fcd41e47af32f07c85689c8ec79d146b04646138dcd2a8d9796a491c167d3"} Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.120280 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.120725 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.639026 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549164-b96mc" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.644366 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.753027 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovn-combined-ca-bundle\") pod \"fed1792b-78eb-43bf-9e33-276a5b4477f7\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.753105 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn56x\" (UniqueName: \"kubernetes.io/projected/fed1792b-78eb-43bf-9e33-276a5b4477f7-kube-api-access-sn56x\") pod \"fed1792b-78eb-43bf-9e33-276a5b4477f7\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.753277 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ssh-key-openstack-edpm-ipam\") pod \"fed1792b-78eb-43bf-9e33-276a5b4477f7\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.753325 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-inventory\") pod \"fed1792b-78eb-43bf-9e33-276a5b4477f7\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.753458 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfr55\" (UniqueName: \"kubernetes.io/projected/536d60ba-03ce-49ad-a616-c6919c867a5d-kube-api-access-pfr55\") pod \"536d60ba-03ce-49ad-a616-c6919c867a5d\" (UID: \"536d60ba-03ce-49ad-a616-c6919c867a5d\") " Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.753496 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovncontroller-config-0\") pod \"fed1792b-78eb-43bf-9e33-276a5b4477f7\" (UID: \"fed1792b-78eb-43bf-9e33-276a5b4477f7\") " Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.759377 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fed1792b-78eb-43bf-9e33-276a5b4477f7" (UID: "fed1792b-78eb-43bf-9e33-276a5b4477f7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.769967 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536d60ba-03ce-49ad-a616-c6919c867a5d-kube-api-access-pfr55" (OuterVolumeSpecName: "kube-api-access-pfr55") pod "536d60ba-03ce-49ad-a616-c6919c867a5d" (UID: "536d60ba-03ce-49ad-a616-c6919c867a5d"). InnerVolumeSpecName "kube-api-access-pfr55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.797940 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed1792b-78eb-43bf-9e33-276a5b4477f7-kube-api-access-sn56x" (OuterVolumeSpecName: "kube-api-access-sn56x") pod "fed1792b-78eb-43bf-9e33-276a5b4477f7" (UID: "fed1792b-78eb-43bf-9e33-276a5b4477f7"). InnerVolumeSpecName "kube-api-access-sn56x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.798922 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "fed1792b-78eb-43bf-9e33-276a5b4477f7" (UID: "fed1792b-78eb-43bf-9e33-276a5b4477f7"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.804851 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-inventory" (OuterVolumeSpecName: "inventory") pod "fed1792b-78eb-43bf-9e33-276a5b4477f7" (UID: "fed1792b-78eb-43bf-9e33-276a5b4477f7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.840873 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fed1792b-78eb-43bf-9e33-276a5b4477f7" (UID: "fed1792b-78eb-43bf-9e33-276a5b4477f7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.857210 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.857247 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn56x\" (UniqueName: \"kubernetes.io/projected/fed1792b-78eb-43bf-9e33-276a5b4477f7-kube-api-access-sn56x\") on node \"crc\" DevicePath \"\"" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.857257 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.857266 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fed1792b-78eb-43bf-9e33-276a5b4477f7-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.857274 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfr55\" (UniqueName: \"kubernetes.io/projected/536d60ba-03ce-49ad-a616-c6919c867a5d-kube-api-access-pfr55\") on node \"crc\" DevicePath \"\"" Mar 08 06:04:04 crc kubenswrapper[4717]: I0308 06:04:04.857284 4717 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fed1792b-78eb-43bf-9e33-276a5b4477f7-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.106339 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" 
event={"ID":"fed1792b-78eb-43bf-9e33-276a5b4477f7","Type":"ContainerDied","Data":"0d20634cafa5921585ca091f4975ddeb88e908521e15c3e301763b11b9e4af8f"} Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.106653 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d20634cafa5921585ca091f4975ddeb88e908521e15c3e301763b11b9e4af8f" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.106420 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2c94z" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.108663 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549164-b96mc" event={"ID":"536d60ba-03ce-49ad-a616-c6919c867a5d","Type":"ContainerDied","Data":"2609c579d76441ba0f6c362b2173a38b631f8a9721a9dbeff699553ee70c6144"} Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.108721 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2609c579d76441ba0f6c362b2173a38b631f8a9721a9dbeff699553ee70c6144" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.108757 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549164-b96mc" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.217785 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8"] Mar 08 06:04:05 crc kubenswrapper[4717]: E0308 06:04:05.218216 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536d60ba-03ce-49ad-a616-c6919c867a5d" containerName="oc" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.218229 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="536d60ba-03ce-49ad-a616-c6919c867a5d" containerName="oc" Mar 08 06:04:05 crc kubenswrapper[4717]: E0308 06:04:05.218242 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed1792b-78eb-43bf-9e33-276a5b4477f7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.218248 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed1792b-78eb-43bf-9e33-276a5b4477f7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.218422 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed1792b-78eb-43bf-9e33-276a5b4477f7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.218438 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="536d60ba-03ce-49ad-a616-c6919c867a5d" containerName="oc" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.219046 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.221416 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.221641 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.222062 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.222215 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.222355 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.222538 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.226357 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8"] Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.264136 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.264199 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.264274 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.264321 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.264344 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.264407 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kp8w\" (UniqueName: \"kubernetes.io/projected/ced2f113-4928-44e4-a34a-3ff2a669dec6-kube-api-access-9kp8w\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.366970 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.367031 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.367085 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.367128 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.367153 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.367203 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kp8w\" (UniqueName: \"kubernetes.io/projected/ced2f113-4928-44e4-a34a-3ff2a669dec6-kube-api-access-9kp8w\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.372239 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.373076 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.375386 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.377708 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.384401 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.389499 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kp8w\" (UniqueName: \"kubernetes.io/projected/ced2f113-4928-44e4-a34a-3ff2a669dec6-kube-api-access-9kp8w\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.538547 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.734589 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549158-c56wc"] Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.746951 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549158-c56wc"] Mar 08 06:04:05 crc kubenswrapper[4717]: I0308 06:04:05.791532 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e990088a-bb32-4b49-90c2-f0307d160ae2" path="/var/lib/kubelet/pods/e990088a-bb32-4b49-90c2-f0307d160ae2/volumes" Mar 08 06:04:06 crc kubenswrapper[4717]: I0308 06:04:06.188024 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8"] Mar 08 06:04:06 crc kubenswrapper[4717]: W0308 06:04:06.195919 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podced2f113_4928_44e4_a34a_3ff2a669dec6.slice/crio-1fa33ccd55fcfc8ddf513f30224ccf4c2c88947bc90681fd27643970d3a67cfe WatchSource:0}: Error finding container 1fa33ccd55fcfc8ddf513f30224ccf4c2c88947bc90681fd27643970d3a67cfe: Status 404 returned error can't find the container with id 1fa33ccd55fcfc8ddf513f30224ccf4c2c88947bc90681fd27643970d3a67cfe Mar 08 06:04:07 crc kubenswrapper[4717]: I0308 06:04:07.126475 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" event={"ID":"ced2f113-4928-44e4-a34a-3ff2a669dec6","Type":"ContainerStarted","Data":"78fc08dc27ad93e0c7c2141a15f5e10eb3225228b487e64c7fe1e4b96c91ab7d"} Mar 08 06:04:07 crc 
kubenswrapper[4717]: I0308 06:04:07.127041 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" event={"ID":"ced2f113-4928-44e4-a34a-3ff2a669dec6","Type":"ContainerStarted","Data":"1fa33ccd55fcfc8ddf513f30224ccf4c2c88947bc90681fd27643970d3a67cfe"} Mar 08 06:04:07 crc kubenswrapper[4717]: I0308 06:04:07.166714 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" podStartSLOduration=1.681855617 podStartE2EDuration="2.166657218s" podCreationTimestamp="2026-03-08 06:04:05 +0000 UTC" firstStartedPulling="2026-03-08 06:04:06.19761529 +0000 UTC m=+2273.115264134" lastFinishedPulling="2026-03-08 06:04:06.682416881 +0000 UTC m=+2273.600065735" observedRunningTime="2026-03-08 06:04:07.146174633 +0000 UTC m=+2274.063823477" watchObservedRunningTime="2026-03-08 06:04:07.166657218 +0000 UTC m=+2274.084306092" Mar 08 06:04:34 crc kubenswrapper[4717]: I0308 06:04:34.120642 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:04:34 crc kubenswrapper[4717]: I0308 06:04:34.121235 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:04:34 crc kubenswrapper[4717]: I0308 06:04:34.121283 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 06:04:34 crc kubenswrapper[4717]: I0308 06:04:34.121918 
4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 06:04:34 crc kubenswrapper[4717]: I0308 06:04:34.121976 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" gracePeriod=600 Mar 08 06:04:34 crc kubenswrapper[4717]: E0308 06:04:34.249709 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:04:34 crc kubenswrapper[4717]: I0308 06:04:34.441148 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" exitCode=0 Mar 08 06:04:34 crc kubenswrapper[4717]: I0308 06:04:34.441212 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e"} Mar 08 06:04:34 crc kubenswrapper[4717]: I0308 06:04:34.441259 4717 scope.go:117] "RemoveContainer" 
containerID="836fb6ea7382ba7653ac2743e70af5d3bd32623b547700d430e91afed9d0c9da" Mar 08 06:04:34 crc kubenswrapper[4717]: I0308 06:04:34.442322 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:04:34 crc kubenswrapper[4717]: E0308 06:04:34.442880 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:04:36 crc kubenswrapper[4717]: I0308 06:04:36.105930 4717 scope.go:117] "RemoveContainer" containerID="89c1b502f4ec03c921ed3ff7f6d1f3e9fc534bdaf933b86f9e5d2db1a122ed0a" Mar 08 06:04:47 crc kubenswrapper[4717]: I0308 06:04:47.782647 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:04:47 crc kubenswrapper[4717]: E0308 06:04:47.783729 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:04:59 crc kubenswrapper[4717]: I0308 06:04:59.711157 4717 generic.go:334] "Generic (PLEG): container finished" podID="ced2f113-4928-44e4-a34a-3ff2a669dec6" containerID="78fc08dc27ad93e0c7c2141a15f5e10eb3225228b487e64c7fe1e4b96c91ab7d" exitCode=0 Mar 08 06:04:59 crc kubenswrapper[4717]: I0308 06:04:59.711332 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" event={"ID":"ced2f113-4928-44e4-a34a-3ff2a669dec6","Type":"ContainerDied","Data":"78fc08dc27ad93e0c7c2141a15f5e10eb3225228b487e64c7fe1e4b96c91ab7d"} Mar 08 06:04:59 crc kubenswrapper[4717]: I0308 06:04:59.782502 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:04:59 crc kubenswrapper[4717]: E0308 06:04:59.783081 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.246451 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.339257 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ced2f113-4928-44e4-a34a-3ff2a669dec6\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.339303 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kp8w\" (UniqueName: \"kubernetes.io/projected/ced2f113-4928-44e4-a34a-3ff2a669dec6-kube-api-access-9kp8w\") pod \"ced2f113-4928-44e4-a34a-3ff2a669dec6\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.339427 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-nova-metadata-neutron-config-0\") pod \"ced2f113-4928-44e4-a34a-3ff2a669dec6\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.340121 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-inventory\") pod \"ced2f113-4928-44e4-a34a-3ff2a669dec6\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.340176 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-ssh-key-openstack-edpm-ipam\") pod \"ced2f113-4928-44e4-a34a-3ff2a669dec6\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " Mar 08 06:05:01 crc 
kubenswrapper[4717]: I0308 06:05:01.340249 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-metadata-combined-ca-bundle\") pod \"ced2f113-4928-44e4-a34a-3ff2a669dec6\" (UID: \"ced2f113-4928-44e4-a34a-3ff2a669dec6\") " Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.346925 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced2f113-4928-44e4-a34a-3ff2a669dec6-kube-api-access-9kp8w" (OuterVolumeSpecName: "kube-api-access-9kp8w") pod "ced2f113-4928-44e4-a34a-3ff2a669dec6" (UID: "ced2f113-4928-44e4-a34a-3ff2a669dec6"). InnerVolumeSpecName "kube-api-access-9kp8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.354631 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ced2f113-4928-44e4-a34a-3ff2a669dec6" (UID: "ced2f113-4928-44e4-a34a-3ff2a669dec6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.385254 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ced2f113-4928-44e4-a34a-3ff2a669dec6" (UID: "ced2f113-4928-44e4-a34a-3ff2a669dec6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.386077 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ced2f113-4928-44e4-a34a-3ff2a669dec6" (UID: "ced2f113-4928-44e4-a34a-3ff2a669dec6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.388063 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ced2f113-4928-44e4-a34a-3ff2a669dec6" (UID: "ced2f113-4928-44e4-a34a-3ff2a669dec6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.393761 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-inventory" (OuterVolumeSpecName: "inventory") pod "ced2f113-4928-44e4-a34a-3ff2a669dec6" (UID: "ced2f113-4928-44e4-a34a-3ff2a669dec6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.442276 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.442525 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kp8w\" (UniqueName: \"kubernetes.io/projected/ced2f113-4928-44e4-a34a-3ff2a669dec6-kube-api-access-9kp8w\") on node \"crc\" DevicePath \"\"" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.442536 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.442546 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.442554 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.442563 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced2f113-4928-44e4-a34a-3ff2a669dec6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.735878 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" event={"ID":"ced2f113-4928-44e4-a34a-3ff2a669dec6","Type":"ContainerDied","Data":"1fa33ccd55fcfc8ddf513f30224ccf4c2c88947bc90681fd27643970d3a67cfe"} Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.735919 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fa33ccd55fcfc8ddf513f30224ccf4c2c88947bc90681fd27643970d3a67cfe" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.735973 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.861841 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm"] Mar 08 06:05:01 crc kubenswrapper[4717]: E0308 06:05:01.862519 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced2f113-4928-44e4-a34a-3ff2a669dec6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.862542 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced2f113-4928-44e4-a34a-3ff2a669dec6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.862992 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced2f113-4928-44e4-a34a-3ff2a669dec6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.864225 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.870244 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm"] Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.892970 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.892984 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.893073 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.893084 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.893234 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.953913 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.953958 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.954006 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.954050 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psd5d\" (UniqueName: \"kubernetes.io/projected/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-kube-api-access-psd5d\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:01 crc kubenswrapper[4717]: I0308 06:05:01.954220 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.057247 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.057489 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.057534 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.057600 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.057662 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psd5d\" (UniqueName: \"kubernetes.io/projected/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-kube-api-access-psd5d\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.069888 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.070265 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.070827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.073721 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.089197 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psd5d\" (UniqueName: \"kubernetes.io/projected/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-kube-api-access-psd5d\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.214910 4717 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.599569 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm"] Mar 08 06:05:02 crc kubenswrapper[4717]: I0308 06:05:02.749358 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" event={"ID":"a99f6dd1-e80a-4191-b85a-31042a1d9fc0","Type":"ContainerStarted","Data":"7d74be9b84246445c6f4414a75fad5cdd71fd3802c72ed1d8ed5699b2186530e"} Mar 08 06:05:03 crc kubenswrapper[4717]: I0308 06:05:03.765760 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" event={"ID":"a99f6dd1-e80a-4191-b85a-31042a1d9fc0","Type":"ContainerStarted","Data":"88635ce00a23777b7ccbc84880eac6013e1d7d23ecef922b0b1c92150cc67d10"} Mar 08 06:05:03 crc kubenswrapper[4717]: I0308 06:05:03.798531 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" podStartSLOduration=2.221057984 podStartE2EDuration="2.798513529s" podCreationTimestamp="2026-03-08 06:05:01 +0000 UTC" firstStartedPulling="2026-03-08 06:05:02.610332569 +0000 UTC m=+2329.527981433" lastFinishedPulling="2026-03-08 06:05:03.187788104 +0000 UTC m=+2330.105436978" observedRunningTime="2026-03-08 06:05:03.796555511 +0000 UTC m=+2330.714204395" watchObservedRunningTime="2026-03-08 06:05:03.798513529 +0000 UTC m=+2330.716162383" Mar 08 06:05:13 crc kubenswrapper[4717]: I0308 06:05:13.792635 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:05:13 crc kubenswrapper[4717]: E0308 06:05:13.793777 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:05:27 crc kubenswrapper[4717]: I0308 06:05:27.782420 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:05:27 crc kubenswrapper[4717]: E0308 06:05:27.783229 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:05:41 crc kubenswrapper[4717]: I0308 06:05:41.782098 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:05:41 crc kubenswrapper[4717]: E0308 06:05:41.783168 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:05:53 crc kubenswrapper[4717]: I0308 06:05:53.797978 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:05:53 crc kubenswrapper[4717]: E0308 06:05:53.798933 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.172655 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549166-6jxgc"] Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.176029 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549166-6jxgc" Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.179507 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.179559 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.179510 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.190220 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549166-6jxgc"] Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.304124 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49qg\" (UniqueName: \"kubernetes.io/projected/f3059b4e-640e-4c71-ab74-8bdb58e351c8-kube-api-access-b49qg\") pod \"auto-csr-approver-29549166-6jxgc\" (UID: \"f3059b4e-640e-4c71-ab74-8bdb58e351c8\") " pod="openshift-infra/auto-csr-approver-29549166-6jxgc" Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.406174 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49qg\" (UniqueName: 
\"kubernetes.io/projected/f3059b4e-640e-4c71-ab74-8bdb58e351c8-kube-api-access-b49qg\") pod \"auto-csr-approver-29549166-6jxgc\" (UID: \"f3059b4e-640e-4c71-ab74-8bdb58e351c8\") " pod="openshift-infra/auto-csr-approver-29549166-6jxgc" Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.439542 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49qg\" (UniqueName: \"kubernetes.io/projected/f3059b4e-640e-4c71-ab74-8bdb58e351c8-kube-api-access-b49qg\") pod \"auto-csr-approver-29549166-6jxgc\" (UID: \"f3059b4e-640e-4c71-ab74-8bdb58e351c8\") " pod="openshift-infra/auto-csr-approver-29549166-6jxgc" Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.500931 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549166-6jxgc" Mar 08 06:06:00 crc kubenswrapper[4717]: W0308 06:06:00.993959 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3059b4e_640e_4c71_ab74_8bdb58e351c8.slice/crio-a057c1346b33ff477c14d9a7d4f88d3da7ad46fc4aa03af18e85849cce795bea WatchSource:0}: Error finding container a057c1346b33ff477c14d9a7d4f88d3da7ad46fc4aa03af18e85849cce795bea: Status 404 returned error can't find the container with id a057c1346b33ff477c14d9a7d4f88d3da7ad46fc4aa03af18e85849cce795bea Mar 08 06:06:00 crc kubenswrapper[4717]: I0308 06:06:00.994673 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549166-6jxgc"] Mar 08 06:06:01 crc kubenswrapper[4717]: I0308 06:06:01.442291 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549166-6jxgc" event={"ID":"f3059b4e-640e-4c71-ab74-8bdb58e351c8","Type":"ContainerStarted","Data":"a057c1346b33ff477c14d9a7d4f88d3da7ad46fc4aa03af18e85849cce795bea"} Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.050584 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-wqvcs"] Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.053136 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.064591 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wqvcs"] Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.149705 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-catalog-content\") pod \"certified-operators-wqvcs\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.149824 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-utilities\") pod \"certified-operators-wqvcs\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.149860 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm8m8\" (UniqueName: \"kubernetes.io/projected/eb50a796-4884-4d64-b54b-1821576399a8-kube-api-access-zm8m8\") pod \"certified-operators-wqvcs\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.251860 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-utilities\") pod \"certified-operators-wqvcs\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") 
" pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.251926 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm8m8\" (UniqueName: \"kubernetes.io/projected/eb50a796-4884-4d64-b54b-1821576399a8-kube-api-access-zm8m8\") pod \"certified-operators-wqvcs\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.252084 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-catalog-content\") pod \"certified-operators-wqvcs\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.252440 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-utilities\") pod \"certified-operators-wqvcs\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.252634 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-catalog-content\") pod \"certified-operators-wqvcs\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.272388 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm8m8\" (UniqueName: \"kubernetes.io/projected/eb50a796-4884-4d64-b54b-1821576399a8-kube-api-access-zm8m8\") pod \"certified-operators-wqvcs\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " 
pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.392344 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.455249 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549166-6jxgc" event={"ID":"f3059b4e-640e-4c71-ab74-8bdb58e351c8","Type":"ContainerStarted","Data":"32aed8448faaf9f858031c040492d3615d92404109e27c8435550707740a7ec8"} Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.472221 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549166-6jxgc" podStartSLOduration=1.692685477 podStartE2EDuration="2.472204742s" podCreationTimestamp="2026-03-08 06:06:00 +0000 UTC" firstStartedPulling="2026-03-08 06:06:00.998410772 +0000 UTC m=+2387.916059646" lastFinishedPulling="2026-03-08 06:06:01.777930037 +0000 UTC m=+2388.695578911" observedRunningTime="2026-03-08 06:06:02.469833533 +0000 UTC m=+2389.387482377" watchObservedRunningTime="2026-03-08 06:06:02.472204742 +0000 UTC m=+2389.389853586" Mar 08 06:06:02 crc kubenswrapper[4717]: I0308 06:06:02.868139 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wqvcs"] Mar 08 06:06:03 crc kubenswrapper[4717]: I0308 06:06:03.464544 4717 generic.go:334] "Generic (PLEG): container finished" podID="f3059b4e-640e-4c71-ab74-8bdb58e351c8" containerID="32aed8448faaf9f858031c040492d3615d92404109e27c8435550707740a7ec8" exitCode=0 Mar 08 06:06:03 crc kubenswrapper[4717]: I0308 06:06:03.464636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549166-6jxgc" event={"ID":"f3059b4e-640e-4c71-ab74-8bdb58e351c8","Type":"ContainerDied","Data":"32aed8448faaf9f858031c040492d3615d92404109e27c8435550707740a7ec8"} Mar 08 06:06:03 crc kubenswrapper[4717]: 
I0308 06:06:03.466281 4717 generic.go:334] "Generic (PLEG): container finished" podID="eb50a796-4884-4d64-b54b-1821576399a8" containerID="5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413" exitCode=0 Mar 08 06:06:03 crc kubenswrapper[4717]: I0308 06:06:03.466333 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqvcs" event={"ID":"eb50a796-4884-4d64-b54b-1821576399a8","Type":"ContainerDied","Data":"5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413"} Mar 08 06:06:03 crc kubenswrapper[4717]: I0308 06:06:03.466395 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqvcs" event={"ID":"eb50a796-4884-4d64-b54b-1821576399a8","Type":"ContainerStarted","Data":"530c7ef6be7d3aa4e616040f5a46e0003ffde900a5282201dbc8c79a73473e8d"} Mar 08 06:06:04 crc kubenswrapper[4717]: I0308 06:06:04.478320 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqvcs" event={"ID":"eb50a796-4884-4d64-b54b-1821576399a8","Type":"ContainerStarted","Data":"58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d"} Mar 08 06:06:04 crc kubenswrapper[4717]: I0308 06:06:04.859888 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549166-6jxgc" Mar 08 06:06:05 crc kubenswrapper[4717]: I0308 06:06:05.007594 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49qg\" (UniqueName: \"kubernetes.io/projected/f3059b4e-640e-4c71-ab74-8bdb58e351c8-kube-api-access-b49qg\") pod \"f3059b4e-640e-4c71-ab74-8bdb58e351c8\" (UID: \"f3059b4e-640e-4c71-ab74-8bdb58e351c8\") " Mar 08 06:06:05 crc kubenswrapper[4717]: I0308 06:06:05.016008 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3059b4e-640e-4c71-ab74-8bdb58e351c8-kube-api-access-b49qg" (OuterVolumeSpecName: "kube-api-access-b49qg") pod "f3059b4e-640e-4c71-ab74-8bdb58e351c8" (UID: "f3059b4e-640e-4c71-ab74-8bdb58e351c8"). InnerVolumeSpecName "kube-api-access-b49qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:06:05 crc kubenswrapper[4717]: I0308 06:06:05.109260 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b49qg\" (UniqueName: \"kubernetes.io/projected/f3059b4e-640e-4c71-ab74-8bdb58e351c8-kube-api-access-b49qg\") on node \"crc\" DevicePath \"\"" Mar 08 06:06:05 crc kubenswrapper[4717]: I0308 06:06:05.489342 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549166-6jxgc" Mar 08 06:06:05 crc kubenswrapper[4717]: I0308 06:06:05.489358 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549166-6jxgc" event={"ID":"f3059b4e-640e-4c71-ab74-8bdb58e351c8","Type":"ContainerDied","Data":"a057c1346b33ff477c14d9a7d4f88d3da7ad46fc4aa03af18e85849cce795bea"} Mar 08 06:06:05 crc kubenswrapper[4717]: I0308 06:06:05.490808 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a057c1346b33ff477c14d9a7d4f88d3da7ad46fc4aa03af18e85849cce795bea" Mar 08 06:06:05 crc kubenswrapper[4717]: I0308 06:06:05.572153 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549160-627gs"] Mar 08 06:06:05 crc kubenswrapper[4717]: I0308 06:06:05.579903 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549160-627gs"] Mar 08 06:06:05 crc kubenswrapper[4717]: I0308 06:06:05.793972 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d83359-ba6f-45da-9eee-f3348558e8cb" path="/var/lib/kubelet/pods/28d83359-ba6f-45da-9eee-f3348558e8cb/volumes" Mar 08 06:06:06 crc kubenswrapper[4717]: I0308 06:06:06.503798 4717 generic.go:334] "Generic (PLEG): container finished" podID="eb50a796-4884-4d64-b54b-1821576399a8" containerID="58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d" exitCode=0 Mar 08 06:06:06 crc kubenswrapper[4717]: I0308 06:06:06.503867 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqvcs" event={"ID":"eb50a796-4884-4d64-b54b-1821576399a8","Type":"ContainerDied","Data":"58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d"} Mar 08 06:06:06 crc kubenswrapper[4717]: I0308 06:06:06.782245 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:06:06 crc 
kubenswrapper[4717]: E0308 06:06:06.782754 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:06:07 crc kubenswrapper[4717]: I0308 06:06:07.514991 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqvcs" event={"ID":"eb50a796-4884-4d64-b54b-1821576399a8","Type":"ContainerStarted","Data":"a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59"} Mar 08 06:06:07 crc kubenswrapper[4717]: I0308 06:06:07.542670 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wqvcs" podStartSLOduration=3.113086761 podStartE2EDuration="6.542650212s" podCreationTimestamp="2026-03-08 06:06:01 +0000 UTC" firstStartedPulling="2026-03-08 06:06:03.468156313 +0000 UTC m=+2390.385805157" lastFinishedPulling="2026-03-08 06:06:06.897719724 +0000 UTC m=+2393.815368608" observedRunningTime="2026-03-08 06:06:07.539825472 +0000 UTC m=+2394.457474326" watchObservedRunningTime="2026-03-08 06:06:07.542650212 +0000 UTC m=+2394.460299066" Mar 08 06:06:12 crc kubenswrapper[4717]: I0308 06:06:12.393419 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:12 crc kubenswrapper[4717]: I0308 06:06:12.394244 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:12 crc kubenswrapper[4717]: I0308 06:06:12.488655 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:12 crc kubenswrapper[4717]: I0308 06:06:12.659261 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:12 crc kubenswrapper[4717]: I0308 06:06:12.743658 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wqvcs"] Mar 08 06:06:14 crc kubenswrapper[4717]: I0308 06:06:14.591180 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wqvcs" podUID="eb50a796-4884-4d64-b54b-1821576399a8" containerName="registry-server" containerID="cri-o://a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59" gracePeriod=2 Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.067518 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.151960 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm8m8\" (UniqueName: \"kubernetes.io/projected/eb50a796-4884-4d64-b54b-1821576399a8-kube-api-access-zm8m8\") pod \"eb50a796-4884-4d64-b54b-1821576399a8\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.152025 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-catalog-content\") pod \"eb50a796-4884-4d64-b54b-1821576399a8\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.152598 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-utilities\") pod 
\"eb50a796-4884-4d64-b54b-1821576399a8\" (UID: \"eb50a796-4884-4d64-b54b-1821576399a8\") " Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.153247 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-utilities" (OuterVolumeSpecName: "utilities") pod "eb50a796-4884-4d64-b54b-1821576399a8" (UID: "eb50a796-4884-4d64-b54b-1821576399a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.153786 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.161662 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb50a796-4884-4d64-b54b-1821576399a8-kube-api-access-zm8m8" (OuterVolumeSpecName: "kube-api-access-zm8m8") pod "eb50a796-4884-4d64-b54b-1821576399a8" (UID: "eb50a796-4884-4d64-b54b-1821576399a8"). InnerVolumeSpecName "kube-api-access-zm8m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.233292 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb50a796-4884-4d64-b54b-1821576399a8" (UID: "eb50a796-4884-4d64-b54b-1821576399a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.255992 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm8m8\" (UniqueName: \"kubernetes.io/projected/eb50a796-4884-4d64-b54b-1821576399a8-kube-api-access-zm8m8\") on node \"crc\" DevicePath \"\"" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.256039 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb50a796-4884-4d64-b54b-1821576399a8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.608472 4717 generic.go:334] "Generic (PLEG): container finished" podID="eb50a796-4884-4d64-b54b-1821576399a8" containerID="a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59" exitCode=0 Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.608531 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqvcs" event={"ID":"eb50a796-4884-4d64-b54b-1821576399a8","Type":"ContainerDied","Data":"a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59"} Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.608575 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqvcs" event={"ID":"eb50a796-4884-4d64-b54b-1821576399a8","Type":"ContainerDied","Data":"530c7ef6be7d3aa4e616040f5a46e0003ffde900a5282201dbc8c79a73473e8d"} Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.608595 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wqvcs" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.608606 4717 scope.go:117] "RemoveContainer" containerID="a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.673384 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wqvcs"] Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.678420 4717 scope.go:117] "RemoveContainer" containerID="58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.685073 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wqvcs"] Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.721346 4717 scope.go:117] "RemoveContainer" containerID="5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.777714 4717 scope.go:117] "RemoveContainer" containerID="a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59" Mar 08 06:06:15 crc kubenswrapper[4717]: E0308 06:06:15.778141 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59\": container with ID starting with a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59 not found: ID does not exist" containerID="a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.778174 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59"} err="failed to get container status \"a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59\": rpc error: code = NotFound desc = could not find 
container \"a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59\": container with ID starting with a3a42243c7823d9df014d517b83c46914d07dafcf3692b18e57b4c56f5a89c59 not found: ID does not exist" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.778192 4717 scope.go:117] "RemoveContainer" containerID="58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d" Mar 08 06:06:15 crc kubenswrapper[4717]: E0308 06:06:15.778504 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d\": container with ID starting with 58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d not found: ID does not exist" containerID="58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.778525 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d"} err="failed to get container status \"58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d\": rpc error: code = NotFound desc = could not find container \"58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d\": container with ID starting with 58c2ba3f7999048800ad3753e7a04026e164779f435cd4490e6102da43105c4d not found: ID does not exist" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.778540 4717 scope.go:117] "RemoveContainer" containerID="5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413" Mar 08 06:06:15 crc kubenswrapper[4717]: E0308 06:06:15.778791 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413\": container with ID starting with 5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413 not found: ID does 
not exist" containerID="5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.778816 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413"} err="failed to get container status \"5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413\": rpc error: code = NotFound desc = could not find container \"5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413\": container with ID starting with 5bd4bf144335eea156a97a41627b50f5aeec0ca8ea09ad5552c50c8fde69d413 not found: ID does not exist" Mar 08 06:06:15 crc kubenswrapper[4717]: I0308 06:06:15.796022 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb50a796-4884-4d64-b54b-1821576399a8" path="/var/lib/kubelet/pods/eb50a796-4884-4d64-b54b-1821576399a8/volumes" Mar 08 06:06:17 crc kubenswrapper[4717]: I0308 06:06:17.782492 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:06:17 crc kubenswrapper[4717]: E0308 06:06:17.784148 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:06:30 crc kubenswrapper[4717]: I0308 06:06:30.781954 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:06:30 crc kubenswrapper[4717]: E0308 06:06:30.783283 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:06:36 crc kubenswrapper[4717]: I0308 06:06:36.268876 4717 scope.go:117] "RemoveContainer" containerID="30a76715bd60145e1352b51bd910145b766498ba839576ef8681ec20147c1d6f" Mar 08 06:06:41 crc kubenswrapper[4717]: I0308 06:06:41.782351 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:06:41 crc kubenswrapper[4717]: E0308 06:06:41.783456 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:06:56 crc kubenswrapper[4717]: I0308 06:06:56.782731 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:06:56 crc kubenswrapper[4717]: E0308 06:06:56.784660 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:07:11 crc kubenswrapper[4717]: I0308 06:07:11.781950 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 
06:07:11 crc kubenswrapper[4717]: E0308 06:07:11.782816 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:07:25 crc kubenswrapper[4717]: I0308 06:07:25.782296 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:07:25 crc kubenswrapper[4717]: E0308 06:07:25.783019 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:07:38 crc kubenswrapper[4717]: I0308 06:07:38.782536 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:07:38 crc kubenswrapper[4717]: E0308 06:07:38.783876 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:07:45 crc kubenswrapper[4717]: I0308 06:07:45.687675 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-665f758875-jsp86" 
podUID="7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 08 06:07:53 crc kubenswrapper[4717]: I0308 06:07:53.794810 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:07:53 crc kubenswrapper[4717]: E0308 06:07:53.796023 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.166895 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549168-4s6cc"] Mar 08 06:08:00 crc kubenswrapper[4717]: E0308 06:08:00.168080 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb50a796-4884-4d64-b54b-1821576399a8" containerName="extract-content" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.168101 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb50a796-4884-4d64-b54b-1821576399a8" containerName="extract-content" Mar 08 06:08:00 crc kubenswrapper[4717]: E0308 06:08:00.168138 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb50a796-4884-4d64-b54b-1821576399a8" containerName="registry-server" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.168152 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb50a796-4884-4d64-b54b-1821576399a8" containerName="registry-server" Mar 08 06:08:00 crc kubenswrapper[4717]: E0308 06:08:00.168205 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3059b4e-640e-4c71-ab74-8bdb58e351c8" containerName="oc" Mar 08 06:08:00 crc 
kubenswrapper[4717]: I0308 06:08:00.168219 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3059b4e-640e-4c71-ab74-8bdb58e351c8" containerName="oc" Mar 08 06:08:00 crc kubenswrapper[4717]: E0308 06:08:00.168251 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb50a796-4884-4d64-b54b-1821576399a8" containerName="extract-utilities" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.168263 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb50a796-4884-4d64-b54b-1821576399a8" containerName="extract-utilities" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.168573 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3059b4e-640e-4c71-ab74-8bdb58e351c8" containerName="oc" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.168594 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb50a796-4884-4d64-b54b-1821576399a8" containerName="registry-server" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.170082 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549168-4s6cc" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.184260 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.184940 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.185397 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.194301 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549168-4s6cc"] Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.355705 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frj2j\" (UniqueName: \"kubernetes.io/projected/08d3b3ca-681b-4df7-8be4-10fc8b68d5d0-kube-api-access-frj2j\") pod \"auto-csr-approver-29549168-4s6cc\" (UID: \"08d3b3ca-681b-4df7-8be4-10fc8b68d5d0\") " pod="openshift-infra/auto-csr-approver-29549168-4s6cc" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.458167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frj2j\" (UniqueName: \"kubernetes.io/projected/08d3b3ca-681b-4df7-8be4-10fc8b68d5d0-kube-api-access-frj2j\") pod \"auto-csr-approver-29549168-4s6cc\" (UID: \"08d3b3ca-681b-4df7-8be4-10fc8b68d5d0\") " pod="openshift-infra/auto-csr-approver-29549168-4s6cc" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.495029 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frj2j\" (UniqueName: \"kubernetes.io/projected/08d3b3ca-681b-4df7-8be4-10fc8b68d5d0-kube-api-access-frj2j\") pod \"auto-csr-approver-29549168-4s6cc\" (UID: \"08d3b3ca-681b-4df7-8be4-10fc8b68d5d0\") " 
pod="openshift-infra/auto-csr-approver-29549168-4s6cc" Mar 08 06:08:00 crc kubenswrapper[4717]: I0308 06:08:00.522431 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549168-4s6cc" Mar 08 06:08:01 crc kubenswrapper[4717]: I0308 06:08:01.022723 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549168-4s6cc"] Mar 08 06:08:01 crc kubenswrapper[4717]: I0308 06:08:01.938472 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549168-4s6cc" event={"ID":"08d3b3ca-681b-4df7-8be4-10fc8b68d5d0","Type":"ContainerStarted","Data":"b8d27d48eafebda4fce2e51506599a67df1b2e4f5d7026d2c1ebf101bcd4f62f"} Mar 08 06:08:02 crc kubenswrapper[4717]: I0308 06:08:02.951934 4717 generic.go:334] "Generic (PLEG): container finished" podID="08d3b3ca-681b-4df7-8be4-10fc8b68d5d0" containerID="197e6c9ec9f5420e8776698a1747a40ae104f0ac9efc8eac613a986d9875eba9" exitCode=0 Mar 08 06:08:02 crc kubenswrapper[4717]: I0308 06:08:02.952062 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549168-4s6cc" event={"ID":"08d3b3ca-681b-4df7-8be4-10fc8b68d5d0","Type":"ContainerDied","Data":"197e6c9ec9f5420e8776698a1747a40ae104f0ac9efc8eac613a986d9875eba9"} Mar 08 06:08:04 crc kubenswrapper[4717]: I0308 06:08:04.380427 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549168-4s6cc" Mar 08 06:08:04 crc kubenswrapper[4717]: I0308 06:08:04.548598 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frj2j\" (UniqueName: \"kubernetes.io/projected/08d3b3ca-681b-4df7-8be4-10fc8b68d5d0-kube-api-access-frj2j\") pod \"08d3b3ca-681b-4df7-8be4-10fc8b68d5d0\" (UID: \"08d3b3ca-681b-4df7-8be4-10fc8b68d5d0\") " Mar 08 06:08:04 crc kubenswrapper[4717]: I0308 06:08:04.560971 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d3b3ca-681b-4df7-8be4-10fc8b68d5d0-kube-api-access-frj2j" (OuterVolumeSpecName: "kube-api-access-frj2j") pod "08d3b3ca-681b-4df7-8be4-10fc8b68d5d0" (UID: "08d3b3ca-681b-4df7-8be4-10fc8b68d5d0"). InnerVolumeSpecName "kube-api-access-frj2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:08:04 crc kubenswrapper[4717]: I0308 06:08:04.651508 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frj2j\" (UniqueName: \"kubernetes.io/projected/08d3b3ca-681b-4df7-8be4-10fc8b68d5d0-kube-api-access-frj2j\") on node \"crc\" DevicePath \"\"" Mar 08 06:08:04 crc kubenswrapper[4717]: I0308 06:08:04.971749 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549168-4s6cc" event={"ID":"08d3b3ca-681b-4df7-8be4-10fc8b68d5d0","Type":"ContainerDied","Data":"b8d27d48eafebda4fce2e51506599a67df1b2e4f5d7026d2c1ebf101bcd4f62f"} Mar 08 06:08:04 crc kubenswrapper[4717]: I0308 06:08:04.971788 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549168-4s6cc" Mar 08 06:08:04 crc kubenswrapper[4717]: I0308 06:08:04.971792 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8d27d48eafebda4fce2e51506599a67df1b2e4f5d7026d2c1ebf101bcd4f62f" Mar 08 06:08:05 crc kubenswrapper[4717]: I0308 06:08:05.457929 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549162-lxg8r"] Mar 08 06:08:05 crc kubenswrapper[4717]: I0308 06:08:05.470983 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549162-lxg8r"] Mar 08 06:08:05 crc kubenswrapper[4717]: I0308 06:08:05.812331 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42305074-feb7-40c9-91c1-e9a34b11a20b" path="/var/lib/kubelet/pods/42305074-feb7-40c9-91c1-e9a34b11a20b/volumes" Mar 08 06:08:08 crc kubenswrapper[4717]: I0308 06:08:08.781971 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:08:08 crc kubenswrapper[4717]: E0308 06:08:08.782575 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:08:21 crc kubenswrapper[4717]: I0308 06:08:21.782924 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:08:21 crc kubenswrapper[4717]: E0308 06:08:21.784170 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:08:36 crc kubenswrapper[4717]: I0308 06:08:36.441229 4717 scope.go:117] "RemoveContainer" containerID="4305ce0069088ea09abb95ef5fb0eeb989a57c756ec7545c137c3e3256942c5d" Mar 08 06:08:36 crc kubenswrapper[4717]: I0308 06:08:36.781790 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:08:36 crc kubenswrapper[4717]: E0308 06:08:36.782540 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:08:51 crc kubenswrapper[4717]: I0308 06:08:51.781629 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:08:51 crc kubenswrapper[4717]: E0308 06:08:51.783113 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:09:03 crc kubenswrapper[4717]: I0308 06:09:03.787214 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:09:03 crc kubenswrapper[4717]: 
E0308 06:09:03.788015 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:09:06 crc kubenswrapper[4717]: I0308 06:09:06.701293 4717 generic.go:334] "Generic (PLEG): container finished" podID="a99f6dd1-e80a-4191-b85a-31042a1d9fc0" containerID="88635ce00a23777b7ccbc84880eac6013e1d7d23ecef922b0b1c92150cc67d10" exitCode=0 Mar 08 06:09:06 crc kubenswrapper[4717]: I0308 06:09:06.701758 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" event={"ID":"a99f6dd1-e80a-4191-b85a-31042a1d9fc0","Type":"ContainerDied","Data":"88635ce00a23777b7ccbc84880eac6013e1d7d23ecef922b0b1c92150cc67d10"} Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.221490 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.284861 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-secret-0\") pod \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.284992 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psd5d\" (UniqueName: \"kubernetes.io/projected/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-kube-api-access-psd5d\") pod \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.285028 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-inventory\") pod \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.285092 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-ssh-key-openstack-edpm-ipam\") pod \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.285335 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-combined-ca-bundle\") pod \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\" (UID: \"a99f6dd1-e80a-4191-b85a-31042a1d9fc0\") " Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.302721 4717 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a99f6dd1-e80a-4191-b85a-31042a1d9fc0" (UID: "a99f6dd1-e80a-4191-b85a-31042a1d9fc0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.304971 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-kube-api-access-psd5d" (OuterVolumeSpecName: "kube-api-access-psd5d") pod "a99f6dd1-e80a-4191-b85a-31042a1d9fc0" (UID: "a99f6dd1-e80a-4191-b85a-31042a1d9fc0"). InnerVolumeSpecName "kube-api-access-psd5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.317862 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a99f6dd1-e80a-4191-b85a-31042a1d9fc0" (UID: "a99f6dd1-e80a-4191-b85a-31042a1d9fc0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.325870 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-inventory" (OuterVolumeSpecName: "inventory") pod "a99f6dd1-e80a-4191-b85a-31042a1d9fc0" (UID: "a99f6dd1-e80a-4191-b85a-31042a1d9fc0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.343370 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a99f6dd1-e80a-4191-b85a-31042a1d9fc0" (UID: "a99f6dd1-e80a-4191-b85a-31042a1d9fc0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.389151 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psd5d\" (UniqueName: \"kubernetes.io/projected/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-kube-api-access-psd5d\") on node \"crc\" DevicePath \"\"" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.389189 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.389203 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.389218 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.389232 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a99f6dd1-e80a-4191-b85a-31042a1d9fc0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.726147 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" event={"ID":"a99f6dd1-e80a-4191-b85a-31042a1d9fc0","Type":"ContainerDied","Data":"7d74be9b84246445c6f4414a75fad5cdd71fd3802c72ed1d8ed5699b2186530e"} Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.726208 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d74be9b84246445c6f4414a75fad5cdd71fd3802c72ed1d8ed5699b2186530e" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.726231 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.849762 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m"] Mar 08 06:09:08 crc kubenswrapper[4717]: E0308 06:09:08.850261 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d3b3ca-681b-4df7-8be4-10fc8b68d5d0" containerName="oc" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.850284 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d3b3ca-681b-4df7-8be4-10fc8b68d5d0" containerName="oc" Mar 08 06:09:08 crc kubenswrapper[4717]: E0308 06:09:08.850305 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99f6dd1-e80a-4191-b85a-31042a1d9fc0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.850314 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99f6dd1-e80a-4191-b85a-31042a1d9fc0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.850579 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d3b3ca-681b-4df7-8be4-10fc8b68d5d0" containerName="oc" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.850608 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a99f6dd1-e80a-4191-b85a-31042a1d9fc0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.851436 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.854619 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.855093 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.855538 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.855918 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.856631 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.857012 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.857485 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.869857 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m"] Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.899873 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.900260 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.900388 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.900417 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbhlc\" (UniqueName: \"kubernetes.io/projected/adf01f26-1066-4901-aa10-cd145a720cd6-kube-api-access-gbhlc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.900452 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.900525 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.900555 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.900600 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.900643 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.900730 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/adf01f26-1066-4901-aa10-cd145a720cd6-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:08 crc kubenswrapper[4717]: I0308 06:09:08.900765 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.003275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.003371 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbhlc\" (UniqueName: \"kubernetes.io/projected/adf01f26-1066-4901-aa10-cd145a720cd6-kube-api-access-gbhlc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.003448 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.003587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.003649 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.003733 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.003818 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.003908 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/adf01f26-1066-4901-aa10-cd145a720cd6-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.003960 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.004057 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.004139 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.007437 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/adf01f26-1066-4901-aa10-cd145a720cd6-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.007766 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.009814 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.012119 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.016268 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" 
Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.016788 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.022047 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.022152 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.023730 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.031844 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.037415 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbhlc\" (UniqueName: \"kubernetes.io/projected/adf01f26-1066-4901-aa10-cd145a720cd6-kube-api-access-gbhlc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rct9m\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.186195 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.755350 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m"] Mar 08 06:09:09 crc kubenswrapper[4717]: I0308 06:09:09.772560 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 06:09:10 crc kubenswrapper[4717]: I0308 06:09:10.752637 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" event={"ID":"adf01f26-1066-4901-aa10-cd145a720cd6","Type":"ContainerStarted","Data":"51b3d6cddbdf18d599af61a3e52f6c903620687aac43cb3512d882b26abe0c32"} Mar 08 06:09:10 crc kubenswrapper[4717]: I0308 06:09:10.753018 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" event={"ID":"adf01f26-1066-4901-aa10-cd145a720cd6","Type":"ContainerStarted","Data":"491bf23d87ea63e60ddce258db791b8b4b605e3f13fc190d23daed38162207f7"} Mar 08 06:09:10 crc kubenswrapper[4717]: I0308 06:09:10.785317 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" podStartSLOduration=2.332298238 podStartE2EDuration="2.785290699s" podCreationTimestamp="2026-03-08 06:09:08 +0000 UTC" firstStartedPulling="2026-03-08 06:09:09.772282041 +0000 UTC m=+2576.689930895" lastFinishedPulling="2026-03-08 06:09:10.225274512 +0000 UTC m=+2577.142923356" observedRunningTime="2026-03-08 06:09:10.782783878 +0000 UTC m=+2577.700432752" watchObservedRunningTime="2026-03-08 06:09:10.785290699 +0000 UTC m=+2577.702939583" Mar 08 06:09:16 crc kubenswrapper[4717]: I0308 06:09:16.782872 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:09:16 crc kubenswrapper[4717]: E0308 06:09:16.783532 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:09:29 crc kubenswrapper[4717]: I0308 06:09:29.782126 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:09:29 crc kubenswrapper[4717]: E0308 06:09:29.783315 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:09:43 crc kubenswrapper[4717]: I0308 06:09:43.794845 4717 scope.go:117] "RemoveContainer" 
containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:09:44 crc kubenswrapper[4717]: I0308 06:09:44.120711 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"c73feaa6bdbaae9017541af3e3f747b21017b402fcea1c7c8f93223332b01f38"} Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.149832 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549170-5jt9q"] Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.152446 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549170-5jt9q" Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.155458 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.155662 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.161139 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.184302 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549170-5jt9q"] Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.338074 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572sl\" (UniqueName: \"kubernetes.io/projected/2f8e8e22-0684-4303-9add-a77e75b09d31-kube-api-access-572sl\") pod \"auto-csr-approver-29549170-5jt9q\" (UID: \"2f8e8e22-0684-4303-9add-a77e75b09d31\") " pod="openshift-infra/auto-csr-approver-29549170-5jt9q" Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.440561 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-572sl\" (UniqueName: \"kubernetes.io/projected/2f8e8e22-0684-4303-9add-a77e75b09d31-kube-api-access-572sl\") pod \"auto-csr-approver-29549170-5jt9q\" (UID: \"2f8e8e22-0684-4303-9add-a77e75b09d31\") " pod="openshift-infra/auto-csr-approver-29549170-5jt9q" Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.460562 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-572sl\" (UniqueName: \"kubernetes.io/projected/2f8e8e22-0684-4303-9add-a77e75b09d31-kube-api-access-572sl\") pod \"auto-csr-approver-29549170-5jt9q\" (UID: \"2f8e8e22-0684-4303-9add-a77e75b09d31\") " pod="openshift-infra/auto-csr-approver-29549170-5jt9q" Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.484516 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549170-5jt9q" Mar 08 06:10:00 crc kubenswrapper[4717]: I0308 06:10:00.780870 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549170-5jt9q"] Mar 08 06:10:01 crc kubenswrapper[4717]: I0308 06:10:01.346916 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549170-5jt9q" event={"ID":"2f8e8e22-0684-4303-9add-a77e75b09d31","Type":"ContainerStarted","Data":"54403c79be7e87eb178fed1692700f7e0bbc5300622d225b02e6f0d79479a384"} Mar 08 06:10:02 crc kubenswrapper[4717]: I0308 06:10:02.361708 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549170-5jt9q" event={"ID":"2f8e8e22-0684-4303-9add-a77e75b09d31","Type":"ContainerStarted","Data":"c7923a259b2780e72f00842b87a806d77ce97d691d1474751182905e9c1f57dd"} Mar 08 06:10:02 crc kubenswrapper[4717]: I0308 06:10:02.395354 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549170-5jt9q" podStartSLOduration=1.312655882 
podStartE2EDuration="2.395331019s" podCreationTimestamp="2026-03-08 06:10:00 +0000 UTC" firstStartedPulling="2026-03-08 06:10:00.795424305 +0000 UTC m=+2627.713073159" lastFinishedPulling="2026-03-08 06:10:01.878099452 +0000 UTC m=+2628.795748296" observedRunningTime="2026-03-08 06:10:02.383826987 +0000 UTC m=+2629.301475831" watchObservedRunningTime="2026-03-08 06:10:02.395331019 +0000 UTC m=+2629.312979873" Mar 08 06:10:03 crc kubenswrapper[4717]: I0308 06:10:03.384660 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f8e8e22-0684-4303-9add-a77e75b09d31" containerID="c7923a259b2780e72f00842b87a806d77ce97d691d1474751182905e9c1f57dd" exitCode=0 Mar 08 06:10:03 crc kubenswrapper[4717]: I0308 06:10:03.384735 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549170-5jt9q" event={"ID":"2f8e8e22-0684-4303-9add-a77e75b09d31","Type":"ContainerDied","Data":"c7923a259b2780e72f00842b87a806d77ce97d691d1474751182905e9c1f57dd"} Mar 08 06:10:04 crc kubenswrapper[4717]: I0308 06:10:04.851609 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549170-5jt9q" Mar 08 06:10:04 crc kubenswrapper[4717]: I0308 06:10:04.943435 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-572sl\" (UniqueName: \"kubernetes.io/projected/2f8e8e22-0684-4303-9add-a77e75b09d31-kube-api-access-572sl\") pod \"2f8e8e22-0684-4303-9add-a77e75b09d31\" (UID: \"2f8e8e22-0684-4303-9add-a77e75b09d31\") " Mar 08 06:10:04 crc kubenswrapper[4717]: I0308 06:10:04.966077 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f8e8e22-0684-4303-9add-a77e75b09d31-kube-api-access-572sl" (OuterVolumeSpecName: "kube-api-access-572sl") pod "2f8e8e22-0684-4303-9add-a77e75b09d31" (UID: "2f8e8e22-0684-4303-9add-a77e75b09d31"). InnerVolumeSpecName "kube-api-access-572sl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:10:05 crc kubenswrapper[4717]: I0308 06:10:05.046775 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-572sl\" (UniqueName: \"kubernetes.io/projected/2f8e8e22-0684-4303-9add-a77e75b09d31-kube-api-access-572sl\") on node \"crc\" DevicePath \"\"" Mar 08 06:10:05 crc kubenswrapper[4717]: I0308 06:10:05.412600 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549170-5jt9q" event={"ID":"2f8e8e22-0684-4303-9add-a77e75b09d31","Type":"ContainerDied","Data":"54403c79be7e87eb178fed1692700f7e0bbc5300622d225b02e6f0d79479a384"} Mar 08 06:10:05 crc kubenswrapper[4717]: I0308 06:10:05.412659 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54403c79be7e87eb178fed1692700f7e0bbc5300622d225b02e6f0d79479a384" Mar 08 06:10:05 crc kubenswrapper[4717]: I0308 06:10:05.412738 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549170-5jt9q" Mar 08 06:10:05 crc kubenswrapper[4717]: I0308 06:10:05.496220 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549164-b96mc"] Mar 08 06:10:05 crc kubenswrapper[4717]: I0308 06:10:05.507500 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549164-b96mc"] Mar 08 06:10:05 crc kubenswrapper[4717]: I0308 06:10:05.804005 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536d60ba-03ce-49ad-a616-c6919c867a5d" path="/var/lib/kubelet/pods/536d60ba-03ce-49ad-a616-c6919c867a5d/volumes" Mar 08 06:10:36 crc kubenswrapper[4717]: I0308 06:10:36.567230 4717 scope.go:117] "RemoveContainer" containerID="e17fcd41e47af32f07c85689c8ec79d146b04646138dcd2a8d9796a491c167d3" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.063697 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-lnx6h"] Mar 08 06:10:50 crc kubenswrapper[4717]: E0308 06:10:50.064729 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f8e8e22-0684-4303-9add-a77e75b09d31" containerName="oc" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.064744 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8e8e22-0684-4303-9add-a77e75b09d31" containerName="oc" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.065006 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f8e8e22-0684-4303-9add-a77e75b09d31" containerName="oc" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.067067 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.080213 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnx6h"] Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.261312 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-utilities\") pod \"redhat-marketplace-lnx6h\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.261445 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cbmj\" (UniqueName: \"kubernetes.io/projected/22078090-20eb-466c-a639-3e9206d4f209-kube-api-access-6cbmj\") pod \"redhat-marketplace-lnx6h\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.262051 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-catalog-content\") pod \"redhat-marketplace-lnx6h\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.363536 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-catalog-content\") pod \"redhat-marketplace-lnx6h\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.363648 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-utilities\") pod \"redhat-marketplace-lnx6h\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.363723 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cbmj\" (UniqueName: \"kubernetes.io/projected/22078090-20eb-466c-a639-3e9206d4f209-kube-api-access-6cbmj\") pod \"redhat-marketplace-lnx6h\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.364206 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-utilities\") pod \"redhat-marketplace-lnx6h\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.364213 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-catalog-content\") pod \"redhat-marketplace-lnx6h\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.392516 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cbmj\" (UniqueName: \"kubernetes.io/projected/22078090-20eb-466c-a639-3e9206d4f209-kube-api-access-6cbmj\") pod \"redhat-marketplace-lnx6h\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.401674 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.852870 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnx6h"] Mar 08 06:10:50 crc kubenswrapper[4717]: I0308 06:10:50.936738 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnx6h" event={"ID":"22078090-20eb-466c-a639-3e9206d4f209","Type":"ContainerStarted","Data":"fadea6bf5da842131cb12a9e113f6a1a518c6020ced78d6a16f23aacbd2be159"} Mar 08 06:10:51 crc kubenswrapper[4717]: I0308 06:10:51.951017 4717 generic.go:334] "Generic (PLEG): container finished" podID="22078090-20eb-466c-a639-3e9206d4f209" containerID="4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae" exitCode=0 Mar 08 06:10:51 crc kubenswrapper[4717]: I0308 06:10:51.951084 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnx6h" event={"ID":"22078090-20eb-466c-a639-3e9206d4f209","Type":"ContainerDied","Data":"4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae"} Mar 08 06:10:52 crc kubenswrapper[4717]: I0308 06:10:52.962031 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lnx6h" event={"ID":"22078090-20eb-466c-a639-3e9206d4f209","Type":"ContainerStarted","Data":"2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f"} Mar 08 06:10:53 crc kubenswrapper[4717]: I0308 06:10:53.973859 4717 generic.go:334] "Generic (PLEG): container finished" podID="22078090-20eb-466c-a639-3e9206d4f209" containerID="2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f" exitCode=0 Mar 08 06:10:53 crc kubenswrapper[4717]: I0308 06:10:53.973946 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnx6h" event={"ID":"22078090-20eb-466c-a639-3e9206d4f209","Type":"ContainerDied","Data":"2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f"} Mar 08 06:10:54 crc kubenswrapper[4717]: I0308 06:10:54.984737 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnx6h" event={"ID":"22078090-20eb-466c-a639-3e9206d4f209","Type":"ContainerStarted","Data":"d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679"} Mar 08 06:10:55 crc kubenswrapper[4717]: I0308 06:10:55.008127 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lnx6h" podStartSLOduration=2.561434841 podStartE2EDuration="5.008103437s" podCreationTimestamp="2026-03-08 06:10:50 +0000 UTC" firstStartedPulling="2026-03-08 06:10:51.955100817 +0000 UTC m=+2678.872749661" lastFinishedPulling="2026-03-08 06:10:54.401769403 +0000 UTC m=+2681.319418257" observedRunningTime="2026-03-08 06:10:55.002826637 +0000 UTC m=+2681.920475491" watchObservedRunningTime="2026-03-08 06:10:55.008103437 +0000 UTC m=+2681.925752291" Mar 08 06:11:00 crc kubenswrapper[4717]: I0308 06:11:00.401843 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:11:00 crc kubenswrapper[4717]: I0308 06:11:00.402358 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:11:00 crc kubenswrapper[4717]: I0308 06:11:00.466950 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:11:01 crc kubenswrapper[4717]: I0308 06:11:01.137272 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:11:01 crc kubenswrapper[4717]: I0308 06:11:01.203060 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnx6h"] Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.103014 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lnx6h" podUID="22078090-20eb-466c-a639-3e9206d4f209" containerName="registry-server" containerID="cri-o://d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679" gracePeriod=2 Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.652973 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.748249 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-catalog-content\") pod \"22078090-20eb-466c-a639-3e9206d4f209\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.748408 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cbmj\" (UniqueName: \"kubernetes.io/projected/22078090-20eb-466c-a639-3e9206d4f209-kube-api-access-6cbmj\") pod \"22078090-20eb-466c-a639-3e9206d4f209\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.749750 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-utilities\") pod \"22078090-20eb-466c-a639-3e9206d4f209\" (UID: \"22078090-20eb-466c-a639-3e9206d4f209\") " Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.750783 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-utilities" (OuterVolumeSpecName: "utilities") pod "22078090-20eb-466c-a639-3e9206d4f209" (UID: "22078090-20eb-466c-a639-3e9206d4f209"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.754978 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22078090-20eb-466c-a639-3e9206d4f209-kube-api-access-6cbmj" (OuterVolumeSpecName: "kube-api-access-6cbmj") pod "22078090-20eb-466c-a639-3e9206d4f209" (UID: "22078090-20eb-466c-a639-3e9206d4f209"). InnerVolumeSpecName "kube-api-access-6cbmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.782676 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22078090-20eb-466c-a639-3e9206d4f209" (UID: "22078090-20eb-466c-a639-3e9206d4f209"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.852588 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.852631 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22078090-20eb-466c-a639-3e9206d4f209-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:03 crc kubenswrapper[4717]: I0308 06:11:03.852645 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cbmj\" (UniqueName: \"kubernetes.io/projected/22078090-20eb-466c-a639-3e9206d4f209-kube-api-access-6cbmj\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.110734 4717 generic.go:334] "Generic (PLEG): container finished" podID="22078090-20eb-466c-a639-3e9206d4f209" containerID="d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679" exitCode=0 Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.110775 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnx6h" event={"ID":"22078090-20eb-466c-a639-3e9206d4f209","Type":"ContainerDied","Data":"d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679"} Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.110800 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-lnx6h" event={"ID":"22078090-20eb-466c-a639-3e9206d4f209","Type":"ContainerDied","Data":"fadea6bf5da842131cb12a9e113f6a1a518c6020ced78d6a16f23aacbd2be159"} Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.110819 4717 scope.go:117] "RemoveContainer" containerID="d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679" Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.110948 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnx6h" Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.144120 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnx6h"] Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.146422 4717 scope.go:117] "RemoveContainer" containerID="2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f" Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.158751 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnx6h"] Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.163537 4717 scope.go:117] "RemoveContainer" containerID="4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae" Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.202429 4717 scope.go:117] "RemoveContainer" containerID="d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679" Mar 08 06:11:04 crc kubenswrapper[4717]: E0308 06:11:04.202794 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679\": container with ID starting with d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679 not found: ID does not exist" containerID="d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679" Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.202820 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679"} err="failed to get container status \"d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679\": rpc error: code = NotFound desc = could not find container \"d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679\": container with ID starting with d05a729cc0f30f9fe037915f1431b36814122bcf3f9cd1605e8eb09b14f0d679 not found: ID does not exist" Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.202840 4717 scope.go:117] "RemoveContainer" containerID="2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f" Mar 08 06:11:04 crc kubenswrapper[4717]: E0308 06:11:04.203229 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f\": container with ID starting with 2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f not found: ID does not exist" containerID="2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f" Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.203254 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f"} err="failed to get container status \"2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f\": rpc error: code = NotFound desc = could not find container \"2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f\": container with ID starting with 2f8cb896ac71b862c4c8b1cf2f27509b85cfb2bc5a8831a0575a165a29711f1f not found: ID does not exist" Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.203269 4717 scope.go:117] "RemoveContainer" containerID="4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae" Mar 08 06:11:04 crc kubenswrapper[4717]: E0308 
06:11:04.203755 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae\": container with ID starting with 4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae not found: ID does not exist" containerID="4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae" Mar 08 06:11:04 crc kubenswrapper[4717]: I0308 06:11:04.203800 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae"} err="failed to get container status \"4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae\": rpc error: code = NotFound desc = could not find container \"4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae\": container with ID starting with 4ea916b3073219fa07d52dcb814ed16741b8b89712852cbd1b59d501cf2774ae not found: ID does not exist" Mar 08 06:11:05 crc kubenswrapper[4717]: I0308 06:11:05.799394 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22078090-20eb-466c-a639-3e9206d4f209" path="/var/lib/kubelet/pods/22078090-20eb-466c-a639-3e9206d4f209/volumes" Mar 08 06:11:44 crc kubenswrapper[4717]: I0308 06:11:44.586359 4717 generic.go:334] "Generic (PLEG): container finished" podID="adf01f26-1066-4901-aa10-cd145a720cd6" containerID="51b3d6cddbdf18d599af61a3e52f6c903620687aac43cb3512d882b26abe0c32" exitCode=0 Mar 08 06:11:44 crc kubenswrapper[4717]: I0308 06:11:44.586595 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" event={"ID":"adf01f26-1066-4901-aa10-cd145a720cd6","Type":"ContainerDied","Data":"51b3d6cddbdf18d599af61a3e52f6c903620687aac43cb3512d882b26abe0c32"} Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.137313 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.159067 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-0\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.159140 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-2\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.159232 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbhlc\" (UniqueName: \"kubernetes.io/projected/adf01f26-1066-4901-aa10-cd145a720cd6-kube-api-access-gbhlc\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.159306 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-3\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.159355 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-ssh-key-openstack-edpm-ipam\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: 
I0308 06:11:46.159407 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-0\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.159471 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-1\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.159516 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-combined-ca-bundle\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.159589 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-1\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.159629 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-inventory\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.166444 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/adf01f26-1066-4901-aa10-cd145a720cd6-nova-extra-config-0\") pod \"adf01f26-1066-4901-aa10-cd145a720cd6\" (UID: \"adf01f26-1066-4901-aa10-cd145a720cd6\") " Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.173972 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.185891 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf01f26-1066-4901-aa10-cd145a720cd6-kube-api-access-gbhlc" (OuterVolumeSpecName: "kube-api-access-gbhlc") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "kube-api-access-gbhlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.216290 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf01f26-1066-4901-aa10-cd145a720cd6-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.219585 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.223071 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.235335 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.235484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.236305 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.236713 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.242399 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.252341 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-inventory" (OuterVolumeSpecName: "inventory") pod "adf01f26-1066-4901-aa10-cd145a720cd6" (UID: "adf01f26-1066-4901-aa10-cd145a720cd6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270269 4717 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/adf01f26-1066-4901-aa10-cd145a720cd6-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270296 4717 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270312 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270326 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbhlc\" (UniqueName: \"kubernetes.io/projected/adf01f26-1066-4901-aa10-cd145a720cd6-kube-api-access-gbhlc\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270339 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270350 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270363 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270375 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270386 4717 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270397 4717 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.270408 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/adf01f26-1066-4901-aa10-cd145a720cd6-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.617417 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" event={"ID":"adf01f26-1066-4901-aa10-cd145a720cd6","Type":"ContainerDied","Data":"491bf23d87ea63e60ddce258db791b8b4b605e3f13fc190d23daed38162207f7"} Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.617467 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="491bf23d87ea63e60ddce258db791b8b4b605e3f13fc190d23daed38162207f7" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.617532 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rct9m" Mar 08 06:11:46 crc kubenswrapper[4717]: E0308 06:11:46.714818 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadf01f26_1066_4901_aa10_cd145a720cd6.slice\": RecentStats: unable to find data in memory cache]" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.757334 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj"] Mar 08 06:11:46 crc kubenswrapper[4717]: E0308 06:11:46.757895 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22078090-20eb-466c-a639-3e9206d4f209" containerName="extract-utilities" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.757918 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="22078090-20eb-466c-a639-3e9206d4f209" containerName="extract-utilities" Mar 08 06:11:46 crc kubenswrapper[4717]: E0308 06:11:46.757946 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22078090-20eb-466c-a639-3e9206d4f209" containerName="extract-content" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.757955 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="22078090-20eb-466c-a639-3e9206d4f209" containerName="extract-content" Mar 08 06:11:46 crc kubenswrapper[4717]: E0308 06:11:46.757977 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22078090-20eb-466c-a639-3e9206d4f209" containerName="registry-server" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.757985 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="22078090-20eb-466c-a639-3e9206d4f209" containerName="registry-server" Mar 08 06:11:46 crc kubenswrapper[4717]: E0308 06:11:46.758000 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf01f26-1066-4901-aa10-cd145a720cd6" 
containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.758012 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf01f26-1066-4901-aa10-cd145a720cd6" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.758292 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="22078090-20eb-466c-a639-3e9206d4f209" containerName="registry-server" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.758330 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf01f26-1066-4901-aa10-cd145a720cd6" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.759181 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.774129 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.774170 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.774274 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vnjxc" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.774188 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.776933 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.795066 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj"] Mar 08 
06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.882836 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.883208 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.883921 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.883998 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2jnj\" (UniqueName: \"kubernetes.io/projected/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-kube-api-access-p2jnj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.884043 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.884130 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.884251 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.986973 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.987042 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.987072 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.987128 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.987177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2jnj\" (UniqueName: \"kubernetes.io/projected/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-kube-api-access-p2jnj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.987216 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" 
(UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.987283 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:46 crc kubenswrapper[4717]: I0308 06:11:46.998261 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:47 crc kubenswrapper[4717]: I0308 06:11:47.001412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:47 crc kubenswrapper[4717]: I0308 06:11:47.005324 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:47 crc kubenswrapper[4717]: I0308 06:11:47.006054 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:47 crc kubenswrapper[4717]: I0308 06:11:47.007167 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:47 crc kubenswrapper[4717]: I0308 06:11:47.007982 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:47 crc kubenswrapper[4717]: I0308 06:11:47.023373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2jnj\" (UniqueName: \"kubernetes.io/projected/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-kube-api-access-p2jnj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:47 crc kubenswrapper[4717]: I0308 06:11:47.090654 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:11:47 crc kubenswrapper[4717]: I0308 06:11:47.627625 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj"] Mar 08 06:11:47 crc kubenswrapper[4717]: I0308 06:11:47.652548 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" event={"ID":"2dabb1b7-df9b-4b70-94dc-d9e29be0856f","Type":"ContainerStarted","Data":"2961ecda8a7f762b77b3babf572188459738e6de45a9eb883a86d27e64a0167d"} Mar 08 06:11:48 crc kubenswrapper[4717]: I0308 06:11:48.665310 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" event={"ID":"2dabb1b7-df9b-4b70-94dc-d9e29be0856f","Type":"ContainerStarted","Data":"dfa4550eb48aadc4018ab2a179bb7499929a990eaca55c86466c928965a62804"} Mar 08 06:11:48 crc kubenswrapper[4717]: I0308 06:11:48.692237 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" podStartSLOduration=2.101147903 podStartE2EDuration="2.692220323s" podCreationTimestamp="2026-03-08 06:11:46 +0000 UTC" firstStartedPulling="2026-03-08 06:11:47.62380515 +0000 UTC m=+2734.541454004" lastFinishedPulling="2026-03-08 06:11:48.21487754 +0000 UTC m=+2735.132526424" observedRunningTime="2026-03-08 06:11:48.690411298 +0000 UTC m=+2735.608060172" watchObservedRunningTime="2026-03-08 06:11:48.692220323 +0000 UTC m=+2735.609869167" Mar 08 06:12:00 crc kubenswrapper[4717]: I0308 06:12:00.161278 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549172-78mbv"] Mar 08 06:12:00 crc kubenswrapper[4717]: I0308 06:12:00.165069 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549172-78mbv" Mar 08 06:12:00 crc kubenswrapper[4717]: I0308 06:12:00.169154 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:12:00 crc kubenswrapper[4717]: I0308 06:12:00.171092 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:12:00 crc kubenswrapper[4717]: I0308 06:12:00.171252 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:12:00 crc kubenswrapper[4717]: I0308 06:12:00.182661 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549172-78mbv"] Mar 08 06:12:00 crc kubenswrapper[4717]: I0308 06:12:00.208795 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j9rn\" (UniqueName: \"kubernetes.io/projected/d8331fca-9814-4444-bbbd-81a5ae2bd273-kube-api-access-7j9rn\") pod \"auto-csr-approver-29549172-78mbv\" (UID: \"d8331fca-9814-4444-bbbd-81a5ae2bd273\") " pod="openshift-infra/auto-csr-approver-29549172-78mbv" Mar 08 06:12:00 crc kubenswrapper[4717]: I0308 06:12:00.311343 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j9rn\" (UniqueName: \"kubernetes.io/projected/d8331fca-9814-4444-bbbd-81a5ae2bd273-kube-api-access-7j9rn\") pod \"auto-csr-approver-29549172-78mbv\" (UID: \"d8331fca-9814-4444-bbbd-81a5ae2bd273\") " pod="openshift-infra/auto-csr-approver-29549172-78mbv" Mar 08 06:12:00 crc kubenswrapper[4717]: I0308 06:12:00.337016 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j9rn\" (UniqueName: \"kubernetes.io/projected/d8331fca-9814-4444-bbbd-81a5ae2bd273-kube-api-access-7j9rn\") pod \"auto-csr-approver-29549172-78mbv\" (UID: \"d8331fca-9814-4444-bbbd-81a5ae2bd273\") " 
pod="openshift-infra/auto-csr-approver-29549172-78mbv" Mar 08 06:12:00 crc kubenswrapper[4717]: I0308 06:12:00.504632 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549172-78mbv" Mar 08 06:12:01 crc kubenswrapper[4717]: I0308 06:12:01.080907 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549172-78mbv"] Mar 08 06:12:01 crc kubenswrapper[4717]: W0308 06:12:01.086745 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8331fca_9814_4444_bbbd_81a5ae2bd273.slice/crio-a76342688486f83cf927616ba3b25f30826603a8bcb48748d6f29397d65c1a92 WatchSource:0}: Error finding container a76342688486f83cf927616ba3b25f30826603a8bcb48748d6f29397d65c1a92: Status 404 returned error can't find the container with id a76342688486f83cf927616ba3b25f30826603a8bcb48748d6f29397d65c1a92 Mar 08 06:12:01 crc kubenswrapper[4717]: I0308 06:12:01.852664 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549172-78mbv" event={"ID":"d8331fca-9814-4444-bbbd-81a5ae2bd273","Type":"ContainerStarted","Data":"a76342688486f83cf927616ba3b25f30826603a8bcb48748d6f29397d65c1a92"} Mar 08 06:12:02 crc kubenswrapper[4717]: I0308 06:12:02.873354 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8331fca-9814-4444-bbbd-81a5ae2bd273" containerID="fbae62e7036aba11c0a4a54cb0771051eb9a750864a94c8fcc954509da3ba203" exitCode=0 Mar 08 06:12:02 crc kubenswrapper[4717]: I0308 06:12:02.873755 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549172-78mbv" event={"ID":"d8331fca-9814-4444-bbbd-81a5ae2bd273","Type":"ContainerDied","Data":"fbae62e7036aba11c0a4a54cb0771051eb9a750864a94c8fcc954509da3ba203"} Mar 08 06:12:04 crc kubenswrapper[4717]: I0308 06:12:04.119925 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:12:04 crc kubenswrapper[4717]: I0308 06:12:04.120234 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:12:04 crc kubenswrapper[4717]: I0308 06:12:04.269282 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549172-78mbv" Mar 08 06:12:04 crc kubenswrapper[4717]: I0308 06:12:04.304636 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j9rn\" (UniqueName: \"kubernetes.io/projected/d8331fca-9814-4444-bbbd-81a5ae2bd273-kube-api-access-7j9rn\") pod \"d8331fca-9814-4444-bbbd-81a5ae2bd273\" (UID: \"d8331fca-9814-4444-bbbd-81a5ae2bd273\") " Mar 08 06:12:04 crc kubenswrapper[4717]: I0308 06:12:04.333364 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8331fca-9814-4444-bbbd-81a5ae2bd273-kube-api-access-7j9rn" (OuterVolumeSpecName: "kube-api-access-7j9rn") pod "d8331fca-9814-4444-bbbd-81a5ae2bd273" (UID: "d8331fca-9814-4444-bbbd-81a5ae2bd273"). InnerVolumeSpecName "kube-api-access-7j9rn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:12:04 crc kubenswrapper[4717]: I0308 06:12:04.408827 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j9rn\" (UniqueName: \"kubernetes.io/projected/d8331fca-9814-4444-bbbd-81a5ae2bd273-kube-api-access-7j9rn\") on node \"crc\" DevicePath \"\"" Mar 08 06:12:04 crc kubenswrapper[4717]: I0308 06:12:04.901483 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549172-78mbv" event={"ID":"d8331fca-9814-4444-bbbd-81a5ae2bd273","Type":"ContainerDied","Data":"a76342688486f83cf927616ba3b25f30826603a8bcb48748d6f29397d65c1a92"} Mar 08 06:12:04 crc kubenswrapper[4717]: I0308 06:12:04.901549 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a76342688486f83cf927616ba3b25f30826603a8bcb48748d6f29397d65c1a92" Mar 08 06:12:04 crc kubenswrapper[4717]: I0308 06:12:04.901640 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549172-78mbv" Mar 08 06:12:05 crc kubenswrapper[4717]: I0308 06:12:05.382270 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549166-6jxgc"] Mar 08 06:12:05 crc kubenswrapper[4717]: I0308 06:12:05.397660 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549166-6jxgc"] Mar 08 06:12:05 crc kubenswrapper[4717]: I0308 06:12:05.823196 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3059b4e-640e-4c71-ab74-8bdb58e351c8" path="/var/lib/kubelet/pods/f3059b4e-640e-4c71-ab74-8bdb58e351c8/volumes" Mar 08 06:12:34 crc kubenswrapper[4717]: I0308 06:12:34.119476 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 08 06:12:34 crc kubenswrapper[4717]: I0308 06:12:34.120035 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:12:36 crc kubenswrapper[4717]: I0308 06:12:36.733982 4717 scope.go:117] "RemoveContainer" containerID="32aed8448faaf9f858031c040492d3615d92404109e27c8435550707740a7ec8" Mar 08 06:13:04 crc kubenswrapper[4717]: I0308 06:13:04.120198 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:13:04 crc kubenswrapper[4717]: I0308 06:13:04.120778 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:13:04 crc kubenswrapper[4717]: I0308 06:13:04.120823 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 06:13:04 crc kubenswrapper[4717]: I0308 06:13:04.121559 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c73feaa6bdbaae9017541af3e3f747b21017b402fcea1c7c8f93223332b01f38"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
08 06:13:04 crc kubenswrapper[4717]: I0308 06:13:04.121622 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://c73feaa6bdbaae9017541af3e3f747b21017b402fcea1c7c8f93223332b01f38" gracePeriod=600 Mar 08 06:13:04 crc kubenswrapper[4717]: I0308 06:13:04.625014 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="c73feaa6bdbaae9017541af3e3f747b21017b402fcea1c7c8f93223332b01f38" exitCode=0 Mar 08 06:13:04 crc kubenswrapper[4717]: I0308 06:13:04.625058 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"c73feaa6bdbaae9017541af3e3f747b21017b402fcea1c7c8f93223332b01f38"} Mar 08 06:13:04 crc kubenswrapper[4717]: I0308 06:13:04.625568 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"} Mar 08 06:13:04 crc kubenswrapper[4717]: I0308 06:13:04.625588 4717 scope.go:117] "RemoveContainer" containerID="9a3517002da2793140b158d33622cd871d0041c5550d75ad7898316ba79b176e" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.685549 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nznqq"] Mar 08 06:13:29 crc kubenswrapper[4717]: E0308 06:13:29.686739 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8331fca-9814-4444-bbbd-81a5ae2bd273" containerName="oc" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.686755 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d8331fca-9814-4444-bbbd-81a5ae2bd273" containerName="oc" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.687000 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8331fca-9814-4444-bbbd-81a5ae2bd273" containerName="oc" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.688966 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.711843 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nznqq"] Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.873032 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-utilities\") pod \"redhat-operators-nznqq\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.873098 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-catalog-content\") pod \"redhat-operators-nznqq\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.873226 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnbn6\" (UniqueName: \"kubernetes.io/projected/741fb7e6-b217-4d5a-8825-b9faaa507b8e-kube-api-access-dnbn6\") pod \"redhat-operators-nznqq\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.975455 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dnbn6\" (UniqueName: \"kubernetes.io/projected/741fb7e6-b217-4d5a-8825-b9faaa507b8e-kube-api-access-dnbn6\") pod \"redhat-operators-nznqq\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.975744 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-utilities\") pod \"redhat-operators-nznqq\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.975803 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-catalog-content\") pod \"redhat-operators-nznqq\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.976672 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-utilities\") pod \"redhat-operators-nznqq\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:29 crc kubenswrapper[4717]: I0308 06:13:29.976749 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-catalog-content\") pod \"redhat-operators-nznqq\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:30 crc kubenswrapper[4717]: I0308 06:13:30.000064 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnbn6\" (UniqueName: 
\"kubernetes.io/projected/741fb7e6-b217-4d5a-8825-b9faaa507b8e-kube-api-access-dnbn6\") pod \"redhat-operators-nznqq\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:30 crc kubenswrapper[4717]: I0308 06:13:30.012063 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:30 crc kubenswrapper[4717]: I0308 06:13:30.524340 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nznqq"] Mar 08 06:13:30 crc kubenswrapper[4717]: I0308 06:13:30.911227 4717 generic.go:334] "Generic (PLEG): container finished" podID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerID="c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174" exitCode=0 Mar 08 06:13:30 crc kubenswrapper[4717]: I0308 06:13:30.911281 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nznqq" event={"ID":"741fb7e6-b217-4d5a-8825-b9faaa507b8e","Type":"ContainerDied","Data":"c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174"} Mar 08 06:13:30 crc kubenswrapper[4717]: I0308 06:13:30.911617 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nznqq" event={"ID":"741fb7e6-b217-4d5a-8825-b9faaa507b8e","Type":"ContainerStarted","Data":"967c3f01a09bbd7e9541e92ff090e60ed4e1ca3055493025a8d05e6123ab574c"} Mar 08 06:13:32 crc kubenswrapper[4717]: I0308 06:13:32.943055 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nznqq" event={"ID":"741fb7e6-b217-4d5a-8825-b9faaa507b8e","Type":"ContainerStarted","Data":"e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3"} Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.677464 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wj96b"] Mar 08 06:13:36 crc 
kubenswrapper[4717]: I0308 06:13:36.679743 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.695919 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wj96b"] Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.821753 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-catalog-content\") pod \"community-operators-wj96b\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.821803 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-utilities\") pod \"community-operators-wj96b\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.821841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd7pr\" (UniqueName: \"kubernetes.io/projected/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-kube-api-access-wd7pr\") pod \"community-operators-wj96b\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.924057 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd7pr\" (UniqueName: \"kubernetes.io/projected/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-kube-api-access-wd7pr\") pod \"community-operators-wj96b\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " 
pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.924588 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-catalog-content\") pod \"community-operators-wj96b\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.924629 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-utilities\") pod \"community-operators-wj96b\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.925113 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-utilities\") pod \"community-operators-wj96b\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.925359 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-catalog-content\") pod \"community-operators-wj96b\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.943757 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd7pr\" (UniqueName: \"kubernetes.io/projected/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-kube-api-access-wd7pr\") pod \"community-operators-wj96b\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " 
pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:36 crc kubenswrapper[4717]: I0308 06:13:36.998890 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:37 crc kubenswrapper[4717]: I0308 06:13:37.543215 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wj96b"] Mar 08 06:13:38 crc kubenswrapper[4717]: I0308 06:13:38.007603 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj96b" event={"ID":"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1","Type":"ContainerStarted","Data":"3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee"} Mar 08 06:13:38 crc kubenswrapper[4717]: I0308 06:13:38.008244 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj96b" event={"ID":"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1","Type":"ContainerStarted","Data":"8ce23cf0beedfb0ef9bd4628004dd1ab7ee627b646afe5b3cb7f03bdc14c31bc"} Mar 08 06:13:39 crc kubenswrapper[4717]: I0308 06:13:39.021160 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerID="3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee" exitCode=0 Mar 08 06:13:39 crc kubenswrapper[4717]: I0308 06:13:39.021451 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj96b" event={"ID":"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1","Type":"ContainerDied","Data":"3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee"} Mar 08 06:13:41 crc kubenswrapper[4717]: I0308 06:13:41.048950 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj96b" event={"ID":"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1","Type":"ContainerStarted","Data":"d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7"} Mar 08 06:13:43 crc 
kubenswrapper[4717]: I0308 06:13:43.068146 4717 generic.go:334] "Generic (PLEG): container finished" podID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerID="e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3" exitCode=0 Mar 08 06:13:43 crc kubenswrapper[4717]: I0308 06:13:43.068198 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nznqq" event={"ID":"741fb7e6-b217-4d5a-8825-b9faaa507b8e","Type":"ContainerDied","Data":"e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3"} Mar 08 06:13:43 crc kubenswrapper[4717]: I0308 06:13:43.071545 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerID="d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7" exitCode=0 Mar 08 06:13:43 crc kubenswrapper[4717]: I0308 06:13:43.071569 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj96b" event={"ID":"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1","Type":"ContainerDied","Data":"d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7"} Mar 08 06:13:44 crc kubenswrapper[4717]: I0308 06:13:44.085128 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj96b" event={"ID":"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1","Type":"ContainerStarted","Data":"7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29"} Mar 08 06:13:44 crc kubenswrapper[4717]: I0308 06:13:44.088436 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nznqq" event={"ID":"741fb7e6-b217-4d5a-8825-b9faaa507b8e","Type":"ContainerStarted","Data":"70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a"} Mar 08 06:13:44 crc kubenswrapper[4717]: I0308 06:13:44.109043 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wj96b" 
podStartSLOduration=3.6522157760000002 podStartE2EDuration="8.109024655s" podCreationTimestamp="2026-03-08 06:13:36 +0000 UTC" firstStartedPulling="2026-03-08 06:13:39.022961299 +0000 UTC m=+2845.940610153" lastFinishedPulling="2026-03-08 06:13:43.479770188 +0000 UTC m=+2850.397419032" observedRunningTime="2026-03-08 06:13:44.102890214 +0000 UTC m=+2851.020539068" watchObservedRunningTime="2026-03-08 06:13:44.109024655 +0000 UTC m=+2851.026673499" Mar 08 06:13:44 crc kubenswrapper[4717]: I0308 06:13:44.135876 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nznqq" podStartSLOduration=2.531823481 podStartE2EDuration="15.135853602s" podCreationTimestamp="2026-03-08 06:13:29 +0000 UTC" firstStartedPulling="2026-03-08 06:13:30.913339269 +0000 UTC m=+2837.830988113" lastFinishedPulling="2026-03-08 06:13:43.51736938 +0000 UTC m=+2850.435018234" observedRunningTime="2026-03-08 06:13:44.124767161 +0000 UTC m=+2851.042416005" watchObservedRunningTime="2026-03-08 06:13:44.135853602 +0000 UTC m=+2851.053502446" Mar 08 06:13:47 crc kubenswrapper[4717]: I0308 06:13:46.999963 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:47 crc kubenswrapper[4717]: I0308 06:13:47.000562 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:48 crc kubenswrapper[4717]: I0308 06:13:48.073677 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wj96b" podUID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerName="registry-server" probeResult="failure" output=< Mar 08 06:13:48 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 06:13:48 crc kubenswrapper[4717]: > Mar 08 06:13:50 crc kubenswrapper[4717]: I0308 06:13:50.013147 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:50 crc kubenswrapper[4717]: I0308 06:13:50.013680 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:13:51 crc kubenswrapper[4717]: I0308 06:13:51.065239 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nznqq" podUID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerName="registry-server" probeResult="failure" output=< Mar 08 06:13:51 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 06:13:51 crc kubenswrapper[4717]: > Mar 08 06:13:57 crc kubenswrapper[4717]: I0308 06:13:57.073057 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:57 crc kubenswrapper[4717]: I0308 06:13:57.158839 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:57 crc kubenswrapper[4717]: I0308 06:13:57.227113 4717 generic.go:334] "Generic (PLEG): container finished" podID="2dabb1b7-df9b-4b70-94dc-d9e29be0856f" containerID="dfa4550eb48aadc4018ab2a179bb7499929a990eaca55c86466c928965a62804" exitCode=0 Mar 08 06:13:57 crc kubenswrapper[4717]: I0308 06:13:57.227465 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" event={"ID":"2dabb1b7-df9b-4b70-94dc-d9e29be0856f","Type":"ContainerDied","Data":"dfa4550eb48aadc4018ab2a179bb7499929a990eaca55c86466c928965a62804"} Mar 08 06:13:57 crc kubenswrapper[4717]: I0308 06:13:57.318755 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wj96b"] Mar 08 06:13:58 crc kubenswrapper[4717]: I0308 06:13:58.238534 4717 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-wj96b" podUID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerName="registry-server" containerID="cri-o://7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29" gracePeriod=2 Mar 08 06:13:58 crc kubenswrapper[4717]: I0308 06:13:58.883609 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:13:58 crc kubenswrapper[4717]: I0308 06:13:58.892184 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.005951 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-0\") pod \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.006027 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-utilities\") pod \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.006050 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd7pr\" (UniqueName: \"kubernetes.io/projected/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-kube-api-access-wd7pr\") pod \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.006088 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-telemetry-combined-ca-bundle\") pod \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.006122 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-1\") pod \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.006191 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-inventory\") pod \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.006240 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-catalog-content\") pod \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\" (UID: \"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1\") " Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.006292 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2jnj\" (UniqueName: \"kubernetes.io/projected/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-kube-api-access-p2jnj\") pod \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.006313 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-2\") pod \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\" (UID: 
\"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.006351 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ssh-key-openstack-edpm-ipam\") pod \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\" (UID: \"2dabb1b7-df9b-4b70-94dc-d9e29be0856f\") " Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.010528 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-utilities" (OuterVolumeSpecName: "utilities") pod "ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" (UID: "ea3afa21-7ec2-4bef-9cd7-06c3e41270d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.012295 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-kube-api-access-wd7pr" (OuterVolumeSpecName: "kube-api-access-wd7pr") pod "ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" (UID: "ea3afa21-7ec2-4bef-9cd7-06c3e41270d1"). InnerVolumeSpecName "kube-api-access-wd7pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.015901 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-kube-api-access-p2jnj" (OuterVolumeSpecName: "kube-api-access-p2jnj") pod "2dabb1b7-df9b-4b70-94dc-d9e29be0856f" (UID: "2dabb1b7-df9b-4b70-94dc-d9e29be0856f"). InnerVolumeSpecName "kube-api-access-p2jnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.018461 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2dabb1b7-df9b-4b70-94dc-d9e29be0856f" (UID: "2dabb1b7-df9b-4b70-94dc-d9e29be0856f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.041898 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-inventory" (OuterVolumeSpecName: "inventory") pod "2dabb1b7-df9b-4b70-94dc-d9e29be0856f" (UID: "2dabb1b7-df9b-4b70-94dc-d9e29be0856f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.042853 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "2dabb1b7-df9b-4b70-94dc-d9e29be0856f" (UID: "2dabb1b7-df9b-4b70-94dc-d9e29be0856f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.056216 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "2dabb1b7-df9b-4b70-94dc-d9e29be0856f" (UID: "2dabb1b7-df9b-4b70-94dc-d9e29be0856f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.059507 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2dabb1b7-df9b-4b70-94dc-d9e29be0856f" (UID: "2dabb1b7-df9b-4b70-94dc-d9e29be0856f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.062843 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "2dabb1b7-df9b-4b70-94dc-d9e29be0856f" (UID: "2dabb1b7-df9b-4b70-94dc-d9e29be0856f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.078769 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" (UID: "ea3afa21-7ec2-4bef-9cd7-06c3e41270d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.109140 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.109367 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.109427 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd7pr\" (UniqueName: \"kubernetes.io/projected/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-kube-api-access-wd7pr\") on node \"crc\" DevicePath \"\"" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.109480 4717 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.109531 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.109582 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.109638 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.109709 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2jnj\" (UniqueName: \"kubernetes.io/projected/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-kube-api-access-p2jnj\") on node \"crc\" DevicePath \"\"" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.109762 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.109823 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2dabb1b7-df9b-4b70-94dc-d9e29be0856f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.248555 4717 generic.go:334] "Generic (PLEG): container finished" podID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerID="7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29" exitCode=0 Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.248619 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wj96b" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.248625 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj96b" event={"ID":"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1","Type":"ContainerDied","Data":"7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29"} Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.248738 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wj96b" event={"ID":"ea3afa21-7ec2-4bef-9cd7-06c3e41270d1","Type":"ContainerDied","Data":"8ce23cf0beedfb0ef9bd4628004dd1ab7ee627b646afe5b3cb7f03bdc14c31bc"} Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.248763 4717 scope.go:117] "RemoveContainer" containerID="7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.251423 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" event={"ID":"2dabb1b7-df9b-4b70-94dc-d9e29be0856f","Type":"ContainerDied","Data":"2961ecda8a7f762b77b3babf572188459738e6de45a9eb883a86d27e64a0167d"} Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.251660 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2961ecda8a7f762b77b3babf572188459738e6de45a9eb883a86d27e64a0167d" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.251511 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.285096 4717 scope.go:117] "RemoveContainer" containerID="d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.320843 4717 scope.go:117] "RemoveContainer" containerID="3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.320961 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wj96b"] Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.331316 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wj96b"] Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.344953 4717 scope.go:117] "RemoveContainer" containerID="7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29" Mar 08 06:13:59 crc kubenswrapper[4717]: E0308 06:13:59.345353 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29\": container with ID starting with 7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29 not found: ID does not exist" containerID="7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.345410 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29"} err="failed to get container status \"7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29\": rpc error: code = NotFound desc = could not find container \"7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29\": container with ID starting with 
7d8cebca2fd6eb4bb001f4505c5cc1c2237b73cab1140e8d7acfd9d7f7be6c29 not found: ID does not exist" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.345444 4717 scope.go:117] "RemoveContainer" containerID="d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7" Mar 08 06:13:59 crc kubenswrapper[4717]: E0308 06:13:59.345741 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7\": container with ID starting with d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7 not found: ID does not exist" containerID="d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.345774 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7"} err="failed to get container status \"d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7\": rpc error: code = NotFound desc = could not find container \"d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7\": container with ID starting with d84f3cc97f2c0d473bc1270b9c6573786205ea9822cb77202d1a498975decba7 not found: ID does not exist" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.345797 4717 scope.go:117] "RemoveContainer" containerID="3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee" Mar 08 06:13:59 crc kubenswrapper[4717]: E0308 06:13:59.362162 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee\": container with ID starting with 3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee not found: ID does not exist" containerID="3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee" Mar 08 06:13:59 crc 
kubenswrapper[4717]: I0308 06:13:59.362224 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee"} err="failed to get container status \"3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee\": rpc error: code = NotFound desc = could not find container \"3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee\": container with ID starting with 3fa052cdc33ef9565051de89aac50581a0af1383d6a11e5ed4eb22abc41de3ee not found: ID does not exist" Mar 08 06:13:59 crc kubenswrapper[4717]: I0308 06:13:59.795590 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" path="/var/lib/kubelet/pods/ea3afa21-7ec2-4bef-9cd7-06c3e41270d1/volumes" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.070274 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.183733 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549174-wmnzs"] Mar 08 06:14:00 crc kubenswrapper[4717]: E0308 06:14:00.184249 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dabb1b7-df9b-4b70-94dc-d9e29be0856f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.184271 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dabb1b7-df9b-4b70-94dc-d9e29be0856f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 08 06:14:00 crc kubenswrapper[4717]: E0308 06:14:00.184283 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerName="extract-content" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.184290 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerName="extract-content" Mar 08 06:14:00 crc kubenswrapper[4717]: E0308 06:14:00.184316 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerName="registry-server" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.184323 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerName="registry-server" Mar 08 06:14:00 crc kubenswrapper[4717]: E0308 06:14:00.184338 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerName="extract-utilities" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.184346 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerName="extract-utilities" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.184596 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dabb1b7-df9b-4b70-94dc-d9e29be0856f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.184626 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3afa21-7ec2-4bef-9cd7-06c3e41270d1" containerName="registry-server" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.185420 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549174-wmnzs" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.196501 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549174-wmnzs"] Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.205654 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.220170 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.228612 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.228853 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.237674 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg8m5\" (UniqueName: \"kubernetes.io/projected/e75f0624-bcc6-4e55-90f2-f9bae2782042-kube-api-access-sg8m5\") pod \"auto-csr-approver-29549174-wmnzs\" (UID: \"e75f0624-bcc6-4e55-90f2-f9bae2782042\") " pod="openshift-infra/auto-csr-approver-29549174-wmnzs" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.347819 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg8m5\" (UniqueName: \"kubernetes.io/projected/e75f0624-bcc6-4e55-90f2-f9bae2782042-kube-api-access-sg8m5\") pod \"auto-csr-approver-29549174-wmnzs\" (UID: \"e75f0624-bcc6-4e55-90f2-f9bae2782042\") " pod="openshift-infra/auto-csr-approver-29549174-wmnzs" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.386876 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg8m5\" (UniqueName: 
\"kubernetes.io/projected/e75f0624-bcc6-4e55-90f2-f9bae2782042-kube-api-access-sg8m5\") pod \"auto-csr-approver-29549174-wmnzs\" (UID: \"e75f0624-bcc6-4e55-90f2-f9bae2782042\") " pod="openshift-infra/auto-csr-approver-29549174-wmnzs" Mar 08 06:14:00 crc kubenswrapper[4717]: I0308 06:14:00.513892 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549174-wmnzs" Mar 08 06:14:01 crc kubenswrapper[4717]: I0308 06:14:01.017308 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549174-wmnzs"] Mar 08 06:14:01 crc kubenswrapper[4717]: W0308 06:14:01.022310 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode75f0624_bcc6_4e55_90f2_f9bae2782042.slice/crio-7acee263b55505d24060aaaf1780fe607b42bd758e7d2defdf61ac269f310642 WatchSource:0}: Error finding container 7acee263b55505d24060aaaf1780fe607b42bd758e7d2defdf61ac269f310642: Status 404 returned error can't find the container with id 7acee263b55505d24060aaaf1780fe607b42bd758e7d2defdf61ac269f310642 Mar 08 06:14:01 crc kubenswrapper[4717]: I0308 06:14:01.303019 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549174-wmnzs" event={"ID":"e75f0624-bcc6-4e55-90f2-f9bae2782042","Type":"ContainerStarted","Data":"7acee263b55505d24060aaaf1780fe607b42bd758e7d2defdf61ac269f310642"} Mar 08 06:14:01 crc kubenswrapper[4717]: I0308 06:14:01.711583 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nznqq"] Mar 08 06:14:01 crc kubenswrapper[4717]: I0308 06:14:01.711859 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nznqq" podUID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerName="registry-server" containerID="cri-o://70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a" 
gracePeriod=2 Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.253004 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.314952 4717 generic.go:334] "Generic (PLEG): container finished" podID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerID="70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a" exitCode=0 Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.314999 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nznqq" event={"ID":"741fb7e6-b217-4d5a-8825-b9faaa507b8e","Type":"ContainerDied","Data":"70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a"} Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.315029 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nznqq" event={"ID":"741fb7e6-b217-4d5a-8825-b9faaa507b8e","Type":"ContainerDied","Data":"967c3f01a09bbd7e9541e92ff090e60ed4e1ca3055493025a8d05e6123ab574c"} Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.315032 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nznqq" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.315073 4717 scope.go:117] "RemoveContainer" containerID="70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.349231 4717 scope.go:117] "RemoveContainer" containerID="e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.375540 4717 scope.go:117] "RemoveContainer" containerID="c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.391506 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-utilities\") pod \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.391585 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnbn6\" (UniqueName: \"kubernetes.io/projected/741fb7e6-b217-4d5a-8825-b9faaa507b8e-kube-api-access-dnbn6\") pod \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.391727 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-catalog-content\") pod \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\" (UID: \"741fb7e6-b217-4d5a-8825-b9faaa507b8e\") " Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.392490 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-utilities" (OuterVolumeSpecName: "utilities") pod "741fb7e6-b217-4d5a-8825-b9faaa507b8e" (UID: 
"741fb7e6-b217-4d5a-8825-b9faaa507b8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.399990 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/741fb7e6-b217-4d5a-8825-b9faaa507b8e-kube-api-access-dnbn6" (OuterVolumeSpecName: "kube-api-access-dnbn6") pod "741fb7e6-b217-4d5a-8825-b9faaa507b8e" (UID: "741fb7e6-b217-4d5a-8825-b9faaa507b8e"). InnerVolumeSpecName "kube-api-access-dnbn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.476333 4717 scope.go:117] "RemoveContainer" containerID="70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a" Mar 08 06:14:02 crc kubenswrapper[4717]: E0308 06:14:02.476936 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a\": container with ID starting with 70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a not found: ID does not exist" containerID="70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.477001 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a"} err="failed to get container status \"70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a\": rpc error: code = NotFound desc = could not find container \"70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a\": container with ID starting with 70bff5cff6b147a7bc8830451f6f02a7075b31faae0ac41187f745f484d9059a not found: ID does not exist" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.477044 4717 scope.go:117] "RemoveContainer" 
containerID="e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3" Mar 08 06:14:02 crc kubenswrapper[4717]: E0308 06:14:02.478075 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3\": container with ID starting with e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3 not found: ID does not exist" containerID="e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.478148 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3"} err="failed to get container status \"e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3\": rpc error: code = NotFound desc = could not find container \"e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3\": container with ID starting with e4d05195082aa0de8d844c790ec0e4f5e946b9af0cd08130e5bc4697edfdfcb3 not found: ID does not exist" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.478179 4717 scope.go:117] "RemoveContainer" containerID="c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174" Mar 08 06:14:02 crc kubenswrapper[4717]: E0308 06:14:02.478566 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174\": container with ID starting with c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174 not found: ID does not exist" containerID="c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.478595 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174"} err="failed to get container status \"c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174\": rpc error: code = NotFound desc = could not find container \"c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174\": container with ID starting with c56ef44fd4cca7e12627c3c8094906bdf8a9e841ebfd45afa4ea63d5fdb31174 not found: ID does not exist" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.494460 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.494484 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnbn6\" (UniqueName: \"kubernetes.io/projected/741fb7e6-b217-4d5a-8825-b9faaa507b8e-kube-api-access-dnbn6\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.536715 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "741fb7e6-b217-4d5a-8825-b9faaa507b8e" (UID: "741fb7e6-b217-4d5a-8825-b9faaa507b8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.595825 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741fb7e6-b217-4d5a-8825-b9faaa507b8e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.667613 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nznqq"] Mar 08 06:14:02 crc kubenswrapper[4717]: I0308 06:14:02.681301 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nznqq"] Mar 08 06:14:03 crc kubenswrapper[4717]: I0308 06:14:03.331922 4717 generic.go:334] "Generic (PLEG): container finished" podID="e75f0624-bcc6-4e55-90f2-f9bae2782042" containerID="34eb5ea1c4ffc90b477f6f6c68dd3d62bcf69b3f31a89d25d4089a024fa8137d" exitCode=0 Mar 08 06:14:03 crc kubenswrapper[4717]: I0308 06:14:03.332174 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549174-wmnzs" event={"ID":"e75f0624-bcc6-4e55-90f2-f9bae2782042","Type":"ContainerDied","Data":"34eb5ea1c4ffc90b477f6f6c68dd3d62bcf69b3f31a89d25d4089a024fa8137d"} Mar 08 06:14:03 crc kubenswrapper[4717]: I0308 06:14:03.805610 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" path="/var/lib/kubelet/pods/741fb7e6-b217-4d5a-8825-b9faaa507b8e/volumes" Mar 08 06:14:04 crc kubenswrapper[4717]: I0308 06:14:04.793594 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549174-wmnzs" Mar 08 06:14:04 crc kubenswrapper[4717]: I0308 06:14:04.847760 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg8m5\" (UniqueName: \"kubernetes.io/projected/e75f0624-bcc6-4e55-90f2-f9bae2782042-kube-api-access-sg8m5\") pod \"e75f0624-bcc6-4e55-90f2-f9bae2782042\" (UID: \"e75f0624-bcc6-4e55-90f2-f9bae2782042\") " Mar 08 06:14:04 crc kubenswrapper[4717]: I0308 06:14:04.864385 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75f0624-bcc6-4e55-90f2-f9bae2782042-kube-api-access-sg8m5" (OuterVolumeSpecName: "kube-api-access-sg8m5") pod "e75f0624-bcc6-4e55-90f2-f9bae2782042" (UID: "e75f0624-bcc6-4e55-90f2-f9bae2782042"). InnerVolumeSpecName "kube-api-access-sg8m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:14:04 crc kubenswrapper[4717]: I0308 06:14:04.971972 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg8m5\" (UniqueName: \"kubernetes.io/projected/e75f0624-bcc6-4e55-90f2-f9bae2782042-kube-api-access-sg8m5\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:05 crc kubenswrapper[4717]: I0308 06:14:05.360638 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549174-wmnzs" event={"ID":"e75f0624-bcc6-4e55-90f2-f9bae2782042","Type":"ContainerDied","Data":"7acee263b55505d24060aaaf1780fe607b42bd758e7d2defdf61ac269f310642"} Mar 08 06:14:05 crc kubenswrapper[4717]: I0308 06:14:05.361033 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7acee263b55505d24060aaaf1780fe607b42bd758e7d2defdf61ac269f310642" Mar 08 06:14:05 crc kubenswrapper[4717]: I0308 06:14:05.360750 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549174-wmnzs" Mar 08 06:14:05 crc kubenswrapper[4717]: I0308 06:14:05.905331 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549168-4s6cc"] Mar 08 06:14:05 crc kubenswrapper[4717]: I0308 06:14:05.928651 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549168-4s6cc"] Mar 08 06:14:07 crc kubenswrapper[4717]: I0308 06:14:07.802432 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d3b3ca-681b-4df7-8be4-10fc8b68d5d0" path="/var/lib/kubelet/pods/08d3b3ca-681b-4df7-8be4-10fc8b68d5d0/volumes" Mar 08 06:14:36 crc kubenswrapper[4717]: I0308 06:14:36.847912 4717 scope.go:117] "RemoveContainer" containerID="197e6c9ec9f5420e8776698a1747a40ae104f0ac9efc8eac613a986d9875eba9" Mar 08 06:14:42 crc kubenswrapper[4717]: I0308 06:14:42.866717 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 06:14:42 crc kubenswrapper[4717]: I0308 06:14:42.867827 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="prometheus" containerID="cri-o://344bfe7e408c4beb924682b8868feaac80462ff38b202cbffdf9c41a89abb55c" gracePeriod=600 Mar 08 06:14:42 crc kubenswrapper[4717]: I0308 06:14:42.868115 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="thanos-sidecar" containerID="cri-o://5362172b23f2bff6327e97fd21c49c93ec075402e3bd2ee3ab189e145635183d" gracePeriod=600 Mar 08 06:14:42 crc kubenswrapper[4717]: I0308 06:14:42.868454 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="config-reloader" 
containerID="cri-o://42353326c814dc7623b5f7f00583d93616c6e372d17ca1899c8a5c1704ca9032" gracePeriod=600 Mar 08 06:14:43 crc kubenswrapper[4717]: I0308 06:14:43.837912 4717 generic.go:334] "Generic (PLEG): container finished" podID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerID="5362172b23f2bff6327e97fd21c49c93ec075402e3bd2ee3ab189e145635183d" exitCode=0 Mar 08 06:14:43 crc kubenswrapper[4717]: I0308 06:14:43.838227 4717 generic.go:334] "Generic (PLEG): container finished" podID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerID="42353326c814dc7623b5f7f00583d93616c6e372d17ca1899c8a5c1704ca9032" exitCode=0 Mar 08 06:14:43 crc kubenswrapper[4717]: I0308 06:14:43.838239 4717 generic.go:334] "Generic (PLEG): container finished" podID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerID="344bfe7e408c4beb924682b8868feaac80462ff38b202cbffdf9c41a89abb55c" exitCode=0 Mar 08 06:14:43 crc kubenswrapper[4717]: I0308 06:14:43.838011 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d445e3b5-cc85-45e1-bcf7-64090947ac5b","Type":"ContainerDied","Data":"5362172b23f2bff6327e97fd21c49c93ec075402e3bd2ee3ab189e145635183d"} Mar 08 06:14:43 crc kubenswrapper[4717]: I0308 06:14:43.838275 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d445e3b5-cc85-45e1-bcf7-64090947ac5b","Type":"ContainerDied","Data":"42353326c814dc7623b5f7f00583d93616c6e372d17ca1899c8a5c1704ca9032"} Mar 08 06:14:43 crc kubenswrapper[4717]: I0308 06:14:43.838288 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d445e3b5-cc85-45e1-bcf7-64090947ac5b","Type":"ContainerDied","Data":"344bfe7e408c4beb924682b8868feaac80462ff38b202cbffdf9c41a89abb55c"} Mar 08 06:14:43 crc kubenswrapper[4717]: I0308 06:14:43.991069 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.089148 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-0\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.089205 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.089264 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.089287 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-thanos-prometheus-http-client-file\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.089314 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-tls-assets\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " 
Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.089617 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.089759 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.089389 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-2\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.090114 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.090148 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtc5b\" (UniqueName: 
\"kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-kube-api-access-xtc5b\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.090228 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.090250 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-1\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.090271 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config-out\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.090309 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.090366 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-secret-combined-ca-bundle\") pod 
\"d445e3b5-cc85-45e1-bcf7-64090947ac5b\" (UID: \"d445e3b5-cc85-45e1-bcf7-64090947ac5b\") " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.090907 4717 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.090930 4717 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.093152 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.096654 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.097392 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.096943 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config" (OuterVolumeSpecName: "config") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.099354 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.099489 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.101603 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-kube-api-access-xtc5b" (OuterVolumeSpecName: "kube-api-access-xtc5b") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "kube-api-access-xtc5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.103193 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.105936 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config-out" (OuterVolumeSpecName: "config-out") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.119387 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "pvc-6176dbd4-0abf-4276-942d-9f92f0510af7". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.167927 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config" (OuterVolumeSpecName: "web-config") pod "d445e3b5-cc85-45e1-bcf7-64090947ac5b" (UID: "d445e3b5-cc85-45e1-bcf7-64090947ac5b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193280 4717 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193325 4717 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193341 4717 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193378 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") on node \"crc\" " Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193397 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtc5b\" (UniqueName: \"kubernetes.io/projected/d445e3b5-cc85-45e1-bcf7-64090947ac5b-kube-api-access-xtc5b\") on node \"crc\" DevicePath 
\"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193411 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193423 4717 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d445e3b5-cc85-45e1-bcf7-64090947ac5b-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193435 4717 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d445e3b5-cc85-45e1-bcf7-64090947ac5b-config-out\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193447 4717 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193462 4717 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.193474 4717 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d445e3b5-cc85-45e1-bcf7-64090947ac5b-web-config\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.232154 4717 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.232311 4717 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6176dbd4-0abf-4276-942d-9f92f0510af7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7") on node "crc" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.295893 4717 reconciler_common.go:293] "Volume detached for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") on node \"crc\" DevicePath \"\"" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.855218 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d445e3b5-cc85-45e1-bcf7-64090947ac5b","Type":"ContainerDied","Data":"cfb8ef99e074569a08fab4e51da9e9a93487b4e588524137310649d1ea370f17"} Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.855273 4717 scope.go:117] "RemoveContainer" containerID="5362172b23f2bff6327e97fd21c49c93ec075402e3bd2ee3ab189e145635183d" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.855320 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.894273 4717 scope.go:117] "RemoveContainer" containerID="42353326c814dc7623b5f7f00583d93616c6e372d17ca1899c8a5c1704ca9032" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.922275 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.938054 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.965541 4717 scope.go:117] "RemoveContainer" containerID="344bfe7e408c4beb924682b8868feaac80462ff38b202cbffdf9c41a89abb55c" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.972293 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 06:14:44 crc kubenswrapper[4717]: E0308 06:14:44.984310 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="init-config-reloader" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984358 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="init-config-reloader" Mar 08 06:14:44 crc kubenswrapper[4717]: E0308 06:14:44.984387 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerName="extract-utilities" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984394 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerName="extract-utilities" Mar 08 06:14:44 crc kubenswrapper[4717]: E0308 06:14:44.984407 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75f0624-bcc6-4e55-90f2-f9bae2782042" containerName="oc" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984414 4717 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e75f0624-bcc6-4e55-90f2-f9bae2782042" containerName="oc" Mar 08 06:14:44 crc kubenswrapper[4717]: E0308 06:14:44.984432 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="config-reloader" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984438 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="config-reloader" Mar 08 06:14:44 crc kubenswrapper[4717]: E0308 06:14:44.984474 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="thanos-sidecar" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984480 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="thanos-sidecar" Mar 08 06:14:44 crc kubenswrapper[4717]: E0308 06:14:44.984500 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerName="extract-content" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984506 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerName="extract-content" Mar 08 06:14:44 crc kubenswrapper[4717]: E0308 06:14:44.984530 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="prometheus" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984536 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="prometheus" Mar 08 06:14:44 crc kubenswrapper[4717]: E0308 06:14:44.984559 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerName="registry-server" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984565 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerName="registry-server" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984885 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75f0624-bcc6-4e55-90f2-f9bae2782042" containerName="oc" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984903 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="741fb7e6-b217-4d5a-8825-b9faaa507b8e" containerName="registry-server" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984929 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="config-reloader" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984937 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="thanos-sidecar" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.984959 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" containerName="prometheus" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.989693 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.991344 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.994577 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.994621 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.994578 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.994926 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.995196 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.997369 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 08 06:14:44 crc kubenswrapper[4717]: I0308 06:14:44.997647 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-ckzwm" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.010760 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.034513 4717 scope.go:117] "RemoveContainer" containerID="da9e55b4b4db08e5866e96ab67da58b228ca3ddb7a01e67275e972c7d3dbf661" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.114959 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115024 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msrl2\" (UniqueName: \"kubernetes.io/projected/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-kube-api-access-msrl2\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115098 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115212 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115311 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc 
kubenswrapper[4717]: I0308 06:14:45.115333 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115359 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115382 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115408 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115425 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115479 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115579 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.115635 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-config\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.216952 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.217447 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.217560 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.217652 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.217766 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.217844 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " 
pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.217923 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.218009 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.218089 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-config\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.218166 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.218240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msrl2\" (UniqueName: 
\"kubernetes.io/projected/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-kube-api-access-msrl2\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.218311 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.218407 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.218565 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.218572 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.219200 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.221652 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.222306 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.222328 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.222423 4717 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.222464 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b9ef073caa6ec1ae4d35eecfe80ee2af5cbcdd85b8b9ead8efa911e24063287d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.223664 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.223974 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-config\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.226840 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.233617 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.238724 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msrl2\" (UniqueName: \"kubernetes.io/projected/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-kube-api-access-msrl2\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.239198 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f60230bf-f6a0-4a30-8d32-fd3ec01cf27a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.262150 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6176dbd4-0abf-4276-942d-9f92f0510af7\") pod \"prometheus-metric-storage-0\" (UID: \"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a\") " pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.332920 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.794772 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d445e3b5-cc85-45e1-bcf7-64090947ac5b" path="/var/lib/kubelet/pods/d445e3b5-cc85-45e1-bcf7-64090947ac5b/volumes" Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.798132 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 06:14:45 crc kubenswrapper[4717]: W0308 06:14:45.811880 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf60230bf_f6a0_4a30_8d32_fd3ec01cf27a.slice/crio-7d3860db1ec474d5c4f6dab59b78d9185eb3ea8d086b567d9822e9785021dd6a WatchSource:0}: Error finding container 7d3860db1ec474d5c4f6dab59b78d9185eb3ea8d086b567d9822e9785021dd6a: Status 404 returned error can't find the container with id 7d3860db1ec474d5c4f6dab59b78d9185eb3ea8d086b567d9822e9785021dd6a Mar 08 06:14:45 crc kubenswrapper[4717]: I0308 06:14:45.873617 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a","Type":"ContainerStarted","Data":"7d3860db1ec474d5c4f6dab59b78d9185eb3ea8d086b567d9822e9785021dd6a"} Mar 08 06:14:50 crc kubenswrapper[4717]: I0308 06:14:50.931477 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a","Type":"ContainerStarted","Data":"81150c5aa7b99ad6c9bf94a50a5bb83b7667210312d8507e05823755be277270"} Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.050278 4717 generic.go:334] "Generic (PLEG): container finished" podID="f60230bf-f6a0-4a30-8d32-fd3ec01cf27a" containerID="81150c5aa7b99ad6c9bf94a50a5bb83b7667210312d8507e05823755be277270" exitCode=0 Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.050403 4717 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a","Type":"ContainerDied","Data":"81150c5aa7b99ad6c9bf94a50a5bb83b7667210312d8507e05823755be277270"} Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.170045 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm"] Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.173385 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.176890 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.177256 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.185232 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm"] Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.263380 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72a9fed6-7639-4038-bc62-0b0ec30c1772-config-volume\") pod \"collect-profiles-29549175-df2jm\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.263501 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzsdh\" (UniqueName: \"kubernetes.io/projected/72a9fed6-7639-4038-bc62-0b0ec30c1772-kube-api-access-fzsdh\") pod \"collect-profiles-29549175-df2jm\" (UID: 
\"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.263654 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72a9fed6-7639-4038-bc62-0b0ec30c1772-secret-volume\") pod \"collect-profiles-29549175-df2jm\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.365931 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72a9fed6-7639-4038-bc62-0b0ec30c1772-secret-volume\") pod \"collect-profiles-29549175-df2jm\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.366065 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72a9fed6-7639-4038-bc62-0b0ec30c1772-config-volume\") pod \"collect-profiles-29549175-df2jm\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.366326 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzsdh\" (UniqueName: \"kubernetes.io/projected/72a9fed6-7639-4038-bc62-0b0ec30c1772-kube-api-access-fzsdh\") pod \"collect-profiles-29549175-df2jm\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.369607 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/72a9fed6-7639-4038-bc62-0b0ec30c1772-config-volume\") pod \"collect-profiles-29549175-df2jm\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.371091 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72a9fed6-7639-4038-bc62-0b0ec30c1772-secret-volume\") pod \"collect-profiles-29549175-df2jm\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.394545 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzsdh\" (UniqueName: \"kubernetes.io/projected/72a9fed6-7639-4038-bc62-0b0ec30c1772-kube-api-access-fzsdh\") pod \"collect-profiles-29549175-df2jm\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:00 crc kubenswrapper[4717]: I0308 06:15:00.550976 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:01 crc kubenswrapper[4717]: I0308 06:15:01.049257 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm"] Mar 08 06:15:01 crc kubenswrapper[4717]: W0308 06:15:01.050857 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a9fed6_7639_4038_bc62_0b0ec30c1772.slice/crio-e1b35dd3fea44d25f027b8073d7ea7b7f98e5bb71445792a73734dc6877da1fe WatchSource:0}: Error finding container e1b35dd3fea44d25f027b8073d7ea7b7f98e5bb71445792a73734dc6877da1fe: Status 404 returned error can't find the container with id e1b35dd3fea44d25f027b8073d7ea7b7f98e5bb71445792a73734dc6877da1fe Mar 08 06:15:01 crc kubenswrapper[4717]: I0308 06:15:01.062168 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a","Type":"ContainerStarted","Data":"711f93ea476227158597f1d9171d0a6057ac42b673e65c4e9dac66104f1debd3"} Mar 08 06:15:02 crc kubenswrapper[4717]: I0308 06:15:02.079536 4717 generic.go:334] "Generic (PLEG): container finished" podID="72a9fed6-7639-4038-bc62-0b0ec30c1772" containerID="424b8ac215b5a68f15fcef7c64a1052c4ce7fd010b1d09ddc06daf94bc750620" exitCode=0 Mar 08 06:15:02 crc kubenswrapper[4717]: I0308 06:15:02.079585 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" event={"ID":"72a9fed6-7639-4038-bc62-0b0ec30c1772","Type":"ContainerDied","Data":"424b8ac215b5a68f15fcef7c64a1052c4ce7fd010b1d09ddc06daf94bc750620"} Mar 08 06:15:02 crc kubenswrapper[4717]: I0308 06:15:02.079885 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" 
event={"ID":"72a9fed6-7639-4038-bc62-0b0ec30c1772","Type":"ContainerStarted","Data":"e1b35dd3fea44d25f027b8073d7ea7b7f98e5bb71445792a73734dc6877da1fe"} Mar 08 06:15:03 crc kubenswrapper[4717]: I0308 06:15:03.705647 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:03 crc kubenswrapper[4717]: I0308 06:15:03.745407 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzsdh\" (UniqueName: \"kubernetes.io/projected/72a9fed6-7639-4038-bc62-0b0ec30c1772-kube-api-access-fzsdh\") pod \"72a9fed6-7639-4038-bc62-0b0ec30c1772\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " Mar 08 06:15:03 crc kubenswrapper[4717]: I0308 06:15:03.745456 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72a9fed6-7639-4038-bc62-0b0ec30c1772-config-volume\") pod \"72a9fed6-7639-4038-bc62-0b0ec30c1772\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " Mar 08 06:15:03 crc kubenswrapper[4717]: I0308 06:15:03.745601 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72a9fed6-7639-4038-bc62-0b0ec30c1772-secret-volume\") pod \"72a9fed6-7639-4038-bc62-0b0ec30c1772\" (UID: \"72a9fed6-7639-4038-bc62-0b0ec30c1772\") " Mar 08 06:15:03 crc kubenswrapper[4717]: I0308 06:15:03.747912 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a9fed6-7639-4038-bc62-0b0ec30c1772-config-volume" (OuterVolumeSpecName: "config-volume") pod "72a9fed6-7639-4038-bc62-0b0ec30c1772" (UID: "72a9fed6-7639-4038-bc62-0b0ec30c1772"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:15:03 crc kubenswrapper[4717]: I0308 06:15:03.754926 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a9fed6-7639-4038-bc62-0b0ec30c1772-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "72a9fed6-7639-4038-bc62-0b0ec30c1772" (UID: "72a9fed6-7639-4038-bc62-0b0ec30c1772"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:15:03 crc kubenswrapper[4717]: I0308 06:15:03.754932 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a9fed6-7639-4038-bc62-0b0ec30c1772-kube-api-access-fzsdh" (OuterVolumeSpecName: "kube-api-access-fzsdh") pod "72a9fed6-7639-4038-bc62-0b0ec30c1772" (UID: "72a9fed6-7639-4038-bc62-0b0ec30c1772"). InnerVolumeSpecName "kube-api-access-fzsdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:15:03 crc kubenswrapper[4717]: I0308 06:15:03.851539 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72a9fed6-7639-4038-bc62-0b0ec30c1772-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 06:15:03 crc kubenswrapper[4717]: I0308 06:15:03.851607 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzsdh\" (UniqueName: \"kubernetes.io/projected/72a9fed6-7639-4038-bc62-0b0ec30c1772-kube-api-access-fzsdh\") on node \"crc\" DevicePath \"\"" Mar 08 06:15:03 crc kubenswrapper[4717]: I0308 06:15:03.851628 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72a9fed6-7639-4038-bc62-0b0ec30c1772-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 06:15:04 crc kubenswrapper[4717]: I0308 06:15:04.108528 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" 
event={"ID":"72a9fed6-7639-4038-bc62-0b0ec30c1772","Type":"ContainerDied","Data":"e1b35dd3fea44d25f027b8073d7ea7b7f98e5bb71445792a73734dc6877da1fe"} Mar 08 06:15:04 crc kubenswrapper[4717]: I0308 06:15:04.108573 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b35dd3fea44d25f027b8073d7ea7b7f98e5bb71445792a73734dc6877da1fe" Mar 08 06:15:04 crc kubenswrapper[4717]: I0308 06:15:04.108621 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm" Mar 08 06:15:04 crc kubenswrapper[4717]: I0308 06:15:04.120065 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:15:04 crc kubenswrapper[4717]: I0308 06:15:04.120488 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:15:04 crc kubenswrapper[4717]: I0308 06:15:04.825135 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4"] Mar 08 06:15:04 crc kubenswrapper[4717]: I0308 06:15:04.838827 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549130-zppk4"] Mar 08 06:15:05 crc kubenswrapper[4717]: I0308 06:15:05.122758 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a","Type":"ContainerStarted","Data":"6ff42a1e898222eab9f2602220f24e108cfae58cec8ba3dcc9c441581d44b25b"} Mar 08 06:15:05 crc kubenswrapper[4717]: I0308 06:15:05.803254 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e14e94f-4d73-4184-837c-9f7ae6e57b20" path="/var/lib/kubelet/pods/1e14e94f-4d73-4184-837c-9f7ae6e57b20/volumes" Mar 08 06:15:06 crc kubenswrapper[4717]: I0308 06:15:06.134711 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f60230bf-f6a0-4a30-8d32-fd3ec01cf27a","Type":"ContainerStarted","Data":"7b3da657e7d8dcc27a3840248688ec4a5cbbc47b2113f5ae40ebddb5cff9eee0"} Mar 08 06:15:06 crc kubenswrapper[4717]: I0308 06:15:06.187501 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.187476467 podStartE2EDuration="22.187476467s" podCreationTimestamp="2026-03-08 06:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 06:15:06.177710388 +0000 UTC m=+2933.095359232" watchObservedRunningTime="2026-03-08 06:15:06.187476467 +0000 UTC m=+2933.105125311" Mar 08 06:15:10 crc kubenswrapper[4717]: I0308 06:15:10.333736 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 08 06:15:15 crc kubenswrapper[4717]: I0308 06:15:15.334128 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 08 06:15:15 crc kubenswrapper[4717]: I0308 06:15:15.341081 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 08 06:15:16 crc kubenswrapper[4717]: I0308 06:15:16.279170 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/prometheus-metric-storage-0" Mar 08 06:15:33 crc kubenswrapper[4717]: I0308 06:15:33.949801 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 08 06:15:33 crc kubenswrapper[4717]: E0308 06:15:33.951049 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a9fed6-7639-4038-bc62-0b0ec30c1772" containerName="collect-profiles" Mar 08 06:15:33 crc kubenswrapper[4717]: I0308 06:15:33.951072 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a9fed6-7639-4038-bc62-0b0ec30c1772" containerName="collect-profiles" Mar 08 06:15:33 crc kubenswrapper[4717]: I0308 06:15:33.951447 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a9fed6-7639-4038-bc62-0b0ec30c1772" containerName="collect-profiles" Mar 08 06:15:33 crc kubenswrapper[4717]: I0308 06:15:33.952533 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 08 06:15:33 crc kubenswrapper[4717]: I0308 06:15:33.955996 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 08 06:15:33 crc kubenswrapper[4717]: I0308 06:15:33.956115 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 08 06:15:33 crc kubenswrapper[4717]: I0308 06:15:33.956123 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w5njs" Mar 08 06:15:33 crc kubenswrapper[4717]: I0308 06:15:33.956000 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 08 06:15:33 crc kubenswrapper[4717]: I0308 06:15:33.971916 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.048533 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.048593 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.048616 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.048909 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-config-data\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.049059 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2qwn\" (UniqueName: \"kubernetes.io/projected/0e0647cd-807a-44fc-a1e0-f5ce609b835d-kube-api-access-b2qwn\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.049133 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.049385 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.049527 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.049652 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.119925 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.119983 4717 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.152654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.153256 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.154015 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.153197 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.154053 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-config-data\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.154220 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2qwn\" (UniqueName: \"kubernetes.io/projected/0e0647cd-807a-44fc-a1e0-f5ce609b835d-kube-api-access-b2qwn\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.154273 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.154427 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.154541 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.154607 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.154766 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.155036 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-config-data\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.155063 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.156232 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.160422 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " 
pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.161636 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.168285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.172853 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2qwn\" (UniqueName: \"kubernetes.io/projected/0e0647cd-807a-44fc-a1e0-f5ce609b835d-kube-api-access-b2qwn\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.215538 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.284899 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.581615 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 08 06:15:34 crc kubenswrapper[4717]: I0308 06:15:34.593045 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 06:15:35 crc kubenswrapper[4717]: I0308 06:15:35.548349 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0e0647cd-807a-44fc-a1e0-f5ce609b835d","Type":"ContainerStarted","Data":"d520811e1c7616ff6434ff78047258ddac782960a76a26a86428bc6c652af646"} Mar 08 06:15:37 crc kubenswrapper[4717]: I0308 06:15:37.013520 4717 scope.go:117] "RemoveContainer" containerID="278121bbcd6a10c1b105beaafc10b9c8eae6fef1ca448bfa6dd5efac355dbf07" Mar 08 06:15:49 crc kubenswrapper[4717]: I0308 06:15:49.684754 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0e0647cd-807a-44fc-a1e0-f5ce609b835d","Type":"ContainerStarted","Data":"ecc674a5ac53902011c233ddb8c66be59ee08e6485b51ba17f5f51c4176e67a8"} Mar 08 06:15:49 crc kubenswrapper[4717]: I0308 06:15:49.724467 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.741816287 podStartE2EDuration="17.724395359s" podCreationTimestamp="2026-03-08 06:15:32 +0000 UTC" firstStartedPulling="2026-03-08 06:15:34.59280929 +0000 UTC m=+2961.510458144" lastFinishedPulling="2026-03-08 06:15:48.575388352 +0000 UTC m=+2975.493037216" observedRunningTime="2026-03-08 06:15:49.709318139 +0000 UTC m=+2976.626966993" watchObservedRunningTime="2026-03-08 06:15:49.724395359 +0000 UTC m=+2976.642044243" Mar 08 06:16:00 crc kubenswrapper[4717]: I0308 06:16:00.175229 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549176-dkztq"] Mar 08 06:16:00 crc 
kubenswrapper[4717]: I0308 06:16:00.181648 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549176-dkztq" Mar 08 06:16:00 crc kubenswrapper[4717]: I0308 06:16:00.186204 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:16:00 crc kubenswrapper[4717]: I0308 06:16:00.186210 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:16:00 crc kubenswrapper[4717]: I0308 06:16:00.186627 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:16:00 crc kubenswrapper[4717]: I0308 06:16:00.187052 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549176-dkztq"] Mar 08 06:16:00 crc kubenswrapper[4717]: I0308 06:16:00.294423 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6bh\" (UniqueName: \"kubernetes.io/projected/92a0fb95-6ecb-4e05-9585-ab3757f20000-kube-api-access-nj6bh\") pod \"auto-csr-approver-29549176-dkztq\" (UID: \"92a0fb95-6ecb-4e05-9585-ab3757f20000\") " pod="openshift-infra/auto-csr-approver-29549176-dkztq" Mar 08 06:16:00 crc kubenswrapper[4717]: I0308 06:16:00.396188 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6bh\" (UniqueName: \"kubernetes.io/projected/92a0fb95-6ecb-4e05-9585-ab3757f20000-kube-api-access-nj6bh\") pod \"auto-csr-approver-29549176-dkztq\" (UID: \"92a0fb95-6ecb-4e05-9585-ab3757f20000\") " pod="openshift-infra/auto-csr-approver-29549176-dkztq" Mar 08 06:16:00 crc kubenswrapper[4717]: I0308 06:16:00.435604 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6bh\" (UniqueName: \"kubernetes.io/projected/92a0fb95-6ecb-4e05-9585-ab3757f20000-kube-api-access-nj6bh\") pod 
\"auto-csr-approver-29549176-dkztq\" (UID: \"92a0fb95-6ecb-4e05-9585-ab3757f20000\") " pod="openshift-infra/auto-csr-approver-29549176-dkztq" Mar 08 06:16:00 crc kubenswrapper[4717]: I0308 06:16:00.518070 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549176-dkztq" Mar 08 06:16:01 crc kubenswrapper[4717]: I0308 06:16:01.086215 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549176-dkztq"] Mar 08 06:16:01 crc kubenswrapper[4717]: I0308 06:16:01.817886 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549176-dkztq" event={"ID":"92a0fb95-6ecb-4e05-9585-ab3757f20000","Type":"ContainerStarted","Data":"fd46e123830f98e20bee0d778800d1c1513193419660a03f2d6d9cec47c265ef"} Mar 08 06:16:02 crc kubenswrapper[4717]: I0308 06:16:02.832457 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549176-dkztq" event={"ID":"92a0fb95-6ecb-4e05-9585-ab3757f20000","Type":"ContainerStarted","Data":"ba05100014f43c0d4cf347d295d68bdfc53996477cd7c5e07b0ca278cef47991"} Mar 08 06:16:02 crc kubenswrapper[4717]: I0308 06:16:02.854028 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549176-dkztq" podStartSLOduration=2.054137187 podStartE2EDuration="2.854003196s" podCreationTimestamp="2026-03-08 06:16:00 +0000 UTC" firstStartedPulling="2026-03-08 06:16:01.08043064 +0000 UTC m=+2987.998079484" lastFinishedPulling="2026-03-08 06:16:01.880296609 +0000 UTC m=+2988.797945493" observedRunningTime="2026-03-08 06:16:02.85131975 +0000 UTC m=+2989.768968634" watchObservedRunningTime="2026-03-08 06:16:02.854003196 +0000 UTC m=+2989.771652080" Mar 08 06:16:03 crc kubenswrapper[4717]: I0308 06:16:03.847290 4717 generic.go:334] "Generic (PLEG): container finished" podID="92a0fb95-6ecb-4e05-9585-ab3757f20000" 
containerID="ba05100014f43c0d4cf347d295d68bdfc53996477cd7c5e07b0ca278cef47991" exitCode=0 Mar 08 06:16:03 crc kubenswrapper[4717]: I0308 06:16:03.847973 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549176-dkztq" event={"ID":"92a0fb95-6ecb-4e05-9585-ab3757f20000","Type":"ContainerDied","Data":"ba05100014f43c0d4cf347d295d68bdfc53996477cd7c5e07b0ca278cef47991"} Mar 08 06:16:04 crc kubenswrapper[4717]: I0308 06:16:04.119752 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:16:04 crc kubenswrapper[4717]: I0308 06:16:04.119826 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:16:04 crc kubenswrapper[4717]: I0308 06:16:04.119876 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 06:16:04 crc kubenswrapper[4717]: I0308 06:16:04.120452 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 06:16:04 crc kubenswrapper[4717]: I0308 06:16:04.120522 4717 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" gracePeriod=600 Mar 08 06:16:04 crc kubenswrapper[4717]: E0308 06:16:04.263644 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:16:04 crc kubenswrapper[4717]: I0308 06:16:04.877679 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" exitCode=0 Mar 08 06:16:04 crc kubenswrapper[4717]: I0308 06:16:04.878326 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"} Mar 08 06:16:04 crc kubenswrapper[4717]: I0308 06:16:04.878379 4717 scope.go:117] "RemoveContainer" containerID="c73feaa6bdbaae9017541af3e3f747b21017b402fcea1c7c8f93223332b01f38" Mar 08 06:16:04 crc kubenswrapper[4717]: I0308 06:16:04.879184 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:16:04 crc kubenswrapper[4717]: E0308 06:16:04.879659 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:16:05 crc kubenswrapper[4717]: I0308 06:16:05.343617 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549176-dkztq" Mar 08 06:16:05 crc kubenswrapper[4717]: I0308 06:16:05.407371 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj6bh\" (UniqueName: \"kubernetes.io/projected/92a0fb95-6ecb-4e05-9585-ab3757f20000-kube-api-access-nj6bh\") pod \"92a0fb95-6ecb-4e05-9585-ab3757f20000\" (UID: \"92a0fb95-6ecb-4e05-9585-ab3757f20000\") " Mar 08 06:16:05 crc kubenswrapper[4717]: I0308 06:16:05.419958 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a0fb95-6ecb-4e05-9585-ab3757f20000-kube-api-access-nj6bh" (OuterVolumeSpecName: "kube-api-access-nj6bh") pod "92a0fb95-6ecb-4e05-9585-ab3757f20000" (UID: "92a0fb95-6ecb-4e05-9585-ab3757f20000"). InnerVolumeSpecName "kube-api-access-nj6bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:16:05 crc kubenswrapper[4717]: I0308 06:16:05.509774 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj6bh\" (UniqueName: \"kubernetes.io/projected/92a0fb95-6ecb-4e05-9585-ab3757f20000-kube-api-access-nj6bh\") on node \"crc\" DevicePath \"\"" Mar 08 06:16:05 crc kubenswrapper[4717]: I0308 06:16:05.930669 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549176-dkztq" event={"ID":"92a0fb95-6ecb-4e05-9585-ab3757f20000","Type":"ContainerDied","Data":"fd46e123830f98e20bee0d778800d1c1513193419660a03f2d6d9cec47c265ef"} Mar 08 06:16:05 crc kubenswrapper[4717]: I0308 06:16:05.932002 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd46e123830f98e20bee0d778800d1c1513193419660a03f2d6d9cec47c265ef" Mar 08 06:16:05 crc kubenswrapper[4717]: I0308 06:16:05.930856 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549176-dkztq" Mar 08 06:16:05 crc kubenswrapper[4717]: I0308 06:16:05.963410 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549170-5jt9q"] Mar 08 06:16:05 crc kubenswrapper[4717]: I0308 06:16:05.974227 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549170-5jt9q"] Mar 08 06:16:07 crc kubenswrapper[4717]: I0308 06:16:07.797558 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f8e8e22-0684-4303-9add-a77e75b09d31" path="/var/lib/kubelet/pods/2f8e8e22-0684-4303-9add-a77e75b09d31/volumes" Mar 08 06:16:16 crc kubenswrapper[4717]: I0308 06:16:16.781886 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:16:16 crc kubenswrapper[4717]: E0308 06:16:16.782925 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:16:30 crc kubenswrapper[4717]: I0308 06:16:30.782826 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:16:30 crc kubenswrapper[4717]: E0308 06:16:30.783772 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:16:37 crc kubenswrapper[4717]: I0308 06:16:37.112566 4717 scope.go:117] "RemoveContainer" containerID="c7923a259b2780e72f00842b87a806d77ce97d691d1474751182905e9c1f57dd" Mar 08 06:16:42 crc kubenswrapper[4717]: I0308 06:16:42.783137 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:16:42 crc kubenswrapper[4717]: E0308 06:16:42.784407 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:16:57 crc kubenswrapper[4717]: I0308 06:16:57.782498 4717 scope.go:117] "RemoveContainer" 
containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:16:57 crc kubenswrapper[4717]: E0308 06:16:57.783598 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.753514 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kqdcn"] Mar 08 06:17:04 crc kubenswrapper[4717]: E0308 06:17:04.754458 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a0fb95-6ecb-4e05-9585-ab3757f20000" containerName="oc" Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.754471 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a0fb95-6ecb-4e05-9585-ab3757f20000" containerName="oc" Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.754693 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a0fb95-6ecb-4e05-9585-ab3757f20000" containerName="oc" Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.755994 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.800107 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-utilities\") pod \"certified-operators-kqdcn\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") " pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.800619 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-catalog-content\") pod \"certified-operators-kqdcn\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") " pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.800838 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj7kb\" (UniqueName: \"kubernetes.io/projected/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-kube-api-access-lj7kb\") pod \"certified-operators-kqdcn\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") " pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.820776 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kqdcn"]
Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.902712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-catalog-content\") pod \"certified-operators-kqdcn\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") " pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.902841 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj7kb\" (UniqueName: \"kubernetes.io/projected/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-kube-api-access-lj7kb\") pod \"certified-operators-kqdcn\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") " pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.902985 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-utilities\") pod \"certified-operators-kqdcn\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") " pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.904079 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-catalog-content\") pod \"certified-operators-kqdcn\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") " pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.904295 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-utilities\") pod \"certified-operators-kqdcn\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") " pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:04 crc kubenswrapper[4717]: I0308 06:17:04.927632 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj7kb\" (UniqueName: \"kubernetes.io/projected/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-kube-api-access-lj7kb\") pod \"certified-operators-kqdcn\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") " pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:05 crc kubenswrapper[4717]: I0308 06:17:05.080523 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:05 crc kubenswrapper[4717]: I0308 06:17:05.616737 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kqdcn"]
Mar 08 06:17:05 crc kubenswrapper[4717]: I0308 06:17:05.629751 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kqdcn" event={"ID":"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093","Type":"ContainerStarted","Data":"88057f8e428cf4a990235a4aade014e4a5949a7bde08f80a479a8fb2afe10399"}
Mar 08 06:17:06 crc kubenswrapper[4717]: I0308 06:17:06.639082 4717 generic.go:334] "Generic (PLEG): container finished" podID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerID="acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9" exitCode=0
Mar 08 06:17:06 crc kubenswrapper[4717]: I0308 06:17:06.639180 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kqdcn" event={"ID":"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093","Type":"ContainerDied","Data":"acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9"}
Mar 08 06:17:07 crc kubenswrapper[4717]: I0308 06:17:07.654342 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kqdcn" event={"ID":"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093","Type":"ContainerStarted","Data":"16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd"}
Mar 08 06:17:09 crc kubenswrapper[4717]: I0308 06:17:09.677532 4717 generic.go:334] "Generic (PLEG): container finished" podID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerID="16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd" exitCode=0
Mar 08 06:17:09 crc kubenswrapper[4717]: I0308 06:17:09.677649 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kqdcn" event={"ID":"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093","Type":"ContainerDied","Data":"16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd"}
Mar 08 06:17:09 crc kubenswrapper[4717]: I0308 06:17:09.781738 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:17:09 crc kubenswrapper[4717]: E0308 06:17:09.782055 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:17:10 crc kubenswrapper[4717]: I0308 06:17:10.694167 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kqdcn" event={"ID":"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093","Type":"ContainerStarted","Data":"033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34"}
Mar 08 06:17:10 crc kubenswrapper[4717]: I0308 06:17:10.733517 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kqdcn" podStartSLOduration=3.295674999 podStartE2EDuration="6.733486787s" podCreationTimestamp="2026-03-08 06:17:04 +0000 UTC" firstStartedPulling="2026-03-08 06:17:06.64092627 +0000 UTC m=+3053.558575114" lastFinishedPulling="2026-03-08 06:17:10.078738048 +0000 UTC m=+3056.996386902" observedRunningTime="2026-03-08 06:17:10.719140835 +0000 UTC m=+3057.636789729" watchObservedRunningTime="2026-03-08 06:17:10.733486787 +0000 UTC m=+3057.651135671"
Mar 08 06:17:15 crc kubenswrapper[4717]: I0308 06:17:15.081513 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:15 crc kubenswrapper[4717]: I0308 06:17:15.082107 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:16 crc kubenswrapper[4717]: I0308 06:17:16.148279 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-kqdcn" podUID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerName="registry-server" probeResult="failure" output=<
Mar 08 06:17:16 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s
Mar 08 06:17:16 crc kubenswrapper[4717]: >
Mar 08 06:17:22 crc kubenswrapper[4717]: I0308 06:17:22.782512 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:17:22 crc kubenswrapper[4717]: E0308 06:17:22.783812 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:17:25 crc kubenswrapper[4717]: I0308 06:17:25.167049 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:25 crc kubenswrapper[4717]: I0308 06:17:25.237786 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:26 crc kubenswrapper[4717]: I0308 06:17:26.587762 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kqdcn"]
Mar 08 06:17:26 crc kubenswrapper[4717]: I0308 06:17:26.888513 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kqdcn" podUID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerName="registry-server" containerID="cri-o://033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34" gracePeriod=2
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.515973 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.621029 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj7kb\" (UniqueName: \"kubernetes.io/projected/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-kube-api-access-lj7kb\") pod \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") "
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.621176 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-catalog-content\") pod \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") "
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.621346 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-utilities\") pod \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\" (UID: \"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093\") "
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.623058 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-utilities" (OuterVolumeSpecName: "utilities") pod "c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" (UID: "c2d0d7b6-3c82-45a6-9bb9-bec7841f6093"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.629030 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-kube-api-access-lj7kb" (OuterVolumeSpecName: "kube-api-access-lj7kb") pod "c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" (UID: "c2d0d7b6-3c82-45a6-9bb9-bec7841f6093"). InnerVolumeSpecName "kube-api-access-lj7kb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.687449 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" (UID: "c2d0d7b6-3c82-45a6-9bb9-bec7841f6093"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.724476 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.724915 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.724937 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj7kb\" (UniqueName: \"kubernetes.io/projected/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093-kube-api-access-lj7kb\") on node \"crc\" DevicePath \"\""
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.901999 4717 generic.go:334] "Generic (PLEG): container finished" podID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerID="033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34" exitCode=0
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.902043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kqdcn" event={"ID":"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093","Type":"ContainerDied","Data":"033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34"}
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.902073 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kqdcn" event={"ID":"c2d0d7b6-3c82-45a6-9bb9-bec7841f6093","Type":"ContainerDied","Data":"88057f8e428cf4a990235a4aade014e4a5949a7bde08f80a479a8fb2afe10399"}
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.902096 4717 scope.go:117] "RemoveContainer" containerID="033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34"
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.902094 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kqdcn"
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.943654 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kqdcn"]
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.950269 4717 scope.go:117] "RemoveContainer" containerID="16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd"
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.958018 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kqdcn"]
Mar 08 06:17:27 crc kubenswrapper[4717]: I0308 06:17:27.977455 4717 scope.go:117] "RemoveContainer" containerID="acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9"
Mar 08 06:17:28 crc kubenswrapper[4717]: I0308 06:17:28.032351 4717 scope.go:117] "RemoveContainer" containerID="033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34"
Mar 08 06:17:28 crc kubenswrapper[4717]: E0308 06:17:28.032892 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34\": container with ID starting with 033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34 not found: ID does not exist" containerID="033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34"
Mar 08 06:17:28 crc kubenswrapper[4717]: I0308 06:17:28.032971 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34"} err="failed to get container status \"033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34\": rpc error: code = NotFound desc = could not find container \"033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34\": container with ID starting with 033463ce159bc8c74e4696db6d5a7842ad5026731c0152ba731a083d76deeb34 not found: ID does not exist"
Mar 08 06:17:28 crc kubenswrapper[4717]: I0308 06:17:28.033001 4717 scope.go:117] "RemoveContainer" containerID="16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd"
Mar 08 06:17:28 crc kubenswrapper[4717]: E0308 06:17:28.033311 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd\": container with ID starting with 16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd not found: ID does not exist" containerID="16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd"
Mar 08 06:17:28 crc kubenswrapper[4717]: I0308 06:17:28.033353 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd"} err="failed to get container status \"16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd\": rpc error: code = NotFound desc = could not find container \"16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd\": container with ID starting with 16d0c7f3d3bb365b31bf3013b5f12cdd13d1f1ab387f201a7ff1a4c5ab51bedd not found: ID does not exist"
Mar 08 06:17:28 crc kubenswrapper[4717]: I0308 06:17:28.033379 4717 scope.go:117] "RemoveContainer" containerID="acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9"
Mar 08 06:17:28 crc kubenswrapper[4717]: E0308 06:17:28.033626 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9\": container with ID starting with acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9 not found: ID does not exist" containerID="acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9"
Mar 08 06:17:28 crc kubenswrapper[4717]: I0308 06:17:28.033652 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9"} err="failed to get container status \"acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9\": rpc error: code = NotFound desc = could not find container \"acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9\": container with ID starting with acf17669827f344b983e2e3782e4a3c3bcaf4b92f6becb1b2eebd22cb298fea9 not found: ID does not exist"
Mar 08 06:17:29 crc kubenswrapper[4717]: I0308 06:17:29.800382 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" path="/var/lib/kubelet/pods/c2d0d7b6-3c82-45a6-9bb9-bec7841f6093/volumes"
Mar 08 06:17:35 crc kubenswrapper[4717]: I0308 06:17:35.783418 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:17:35 crc kubenswrapper[4717]: E0308 06:17:35.784502 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:17:49 crc kubenswrapper[4717]: I0308 06:17:49.782152 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:17:49 crc kubenswrapper[4717]: E0308 06:17:49.783373 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.151550 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549178-zphcw"]
Mar 08 06:18:00 crc kubenswrapper[4717]: E0308 06:18:00.152966 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerName="extract-content"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.152987 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerName="extract-content"
Mar 08 06:18:00 crc kubenswrapper[4717]: E0308 06:18:00.153002 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerName="registry-server"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.153011 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerName="registry-server"
Mar 08 06:18:00 crc kubenswrapper[4717]: E0308 06:18:00.153072 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerName="extract-utilities"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.153085 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerName="extract-utilities"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.153399 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d0d7b6-3c82-45a6-9bb9-bec7841f6093" containerName="registry-server"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.154489 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549178-zphcw"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.156916 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.157598 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.157660 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.163853 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549178-zphcw"]
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.245605 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnxk9\" (UniqueName: \"kubernetes.io/projected/2d940c69-ce7d-45dc-be11-463c7e8b0b9e-kube-api-access-wnxk9\") pod \"auto-csr-approver-29549178-zphcw\" (UID: \"2d940c69-ce7d-45dc-be11-463c7e8b0b9e\") " pod="openshift-infra/auto-csr-approver-29549178-zphcw"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.348373 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnxk9\" (UniqueName: \"kubernetes.io/projected/2d940c69-ce7d-45dc-be11-463c7e8b0b9e-kube-api-access-wnxk9\") pod \"auto-csr-approver-29549178-zphcw\" (UID: \"2d940c69-ce7d-45dc-be11-463c7e8b0b9e\") " pod="openshift-infra/auto-csr-approver-29549178-zphcw"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.370704 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnxk9\" (UniqueName: \"kubernetes.io/projected/2d940c69-ce7d-45dc-be11-463c7e8b0b9e-kube-api-access-wnxk9\") pod \"auto-csr-approver-29549178-zphcw\" (UID: \"2d940c69-ce7d-45dc-be11-463c7e8b0b9e\") " pod="openshift-infra/auto-csr-approver-29549178-zphcw"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.475396 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549178-zphcw"
Mar 08 06:18:00 crc kubenswrapper[4717]: I0308 06:18:00.972939 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549178-zphcw"]
Mar 08 06:18:01 crc kubenswrapper[4717]: I0308 06:18:01.275790 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549178-zphcw" event={"ID":"2d940c69-ce7d-45dc-be11-463c7e8b0b9e","Type":"ContainerStarted","Data":"1148a383ae7ca7cb5f1141b90dbebbd943fda328d7496eed4ae6a0760068deba"}
Mar 08 06:18:02 crc kubenswrapper[4717]: I0308 06:18:02.289513 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549178-zphcw" event={"ID":"2d940c69-ce7d-45dc-be11-463c7e8b0b9e","Type":"ContainerStarted","Data":"82f863aa82489fe42508f6dd874792d823b6fc89b2c6c2be4aac047c85318af4"}
Mar 08 06:18:02 crc kubenswrapper[4717]: I0308 06:18:02.782329 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:18:02 crc kubenswrapper[4717]: E0308 06:18:02.782590 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:18:03 crc kubenswrapper[4717]: I0308 06:18:03.305193 4717 generic.go:334] "Generic (PLEG): container finished" podID="2d940c69-ce7d-45dc-be11-463c7e8b0b9e" containerID="82f863aa82489fe42508f6dd874792d823b6fc89b2c6c2be4aac047c85318af4" exitCode=0
Mar 08 06:18:03 crc kubenswrapper[4717]: I0308 06:18:03.305504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549178-zphcw" event={"ID":"2d940c69-ce7d-45dc-be11-463c7e8b0b9e","Type":"ContainerDied","Data":"82f863aa82489fe42508f6dd874792d823b6fc89b2c6c2be4aac047c85318af4"}
Mar 08 06:18:04 crc kubenswrapper[4717]: I0308 06:18:04.806599 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549178-zphcw"
Mar 08 06:18:04 crc kubenswrapper[4717]: I0308 06:18:04.855398 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnxk9\" (UniqueName: \"kubernetes.io/projected/2d940c69-ce7d-45dc-be11-463c7e8b0b9e-kube-api-access-wnxk9\") pod \"2d940c69-ce7d-45dc-be11-463c7e8b0b9e\" (UID: \"2d940c69-ce7d-45dc-be11-463c7e8b0b9e\") "
Mar 08 06:18:04 crc kubenswrapper[4717]: I0308 06:18:04.861605 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d940c69-ce7d-45dc-be11-463c7e8b0b9e-kube-api-access-wnxk9" (OuterVolumeSpecName: "kube-api-access-wnxk9") pod "2d940c69-ce7d-45dc-be11-463c7e8b0b9e" (UID: "2d940c69-ce7d-45dc-be11-463c7e8b0b9e"). InnerVolumeSpecName "kube-api-access-wnxk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 06:18:04 crc kubenswrapper[4717]: I0308 06:18:04.958323 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnxk9\" (UniqueName: \"kubernetes.io/projected/2d940c69-ce7d-45dc-be11-463c7e8b0b9e-kube-api-access-wnxk9\") on node \"crc\" DevicePath \"\""
Mar 08 06:18:05 crc kubenswrapper[4717]: I0308 06:18:05.343911 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549178-zphcw" event={"ID":"2d940c69-ce7d-45dc-be11-463c7e8b0b9e","Type":"ContainerDied","Data":"1148a383ae7ca7cb5f1141b90dbebbd943fda328d7496eed4ae6a0760068deba"}
Mar 08 06:18:05 crc kubenswrapper[4717]: I0308 06:18:05.344342 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1148a383ae7ca7cb5f1141b90dbebbd943fda328d7496eed4ae6a0760068deba"
Mar 08 06:18:05 crc kubenswrapper[4717]: I0308 06:18:05.343981 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549178-zphcw"
Mar 08 06:18:05 crc kubenswrapper[4717]: I0308 06:18:05.396165 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549172-78mbv"]
Mar 08 06:18:05 crc kubenswrapper[4717]: I0308 06:18:05.407714 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549172-78mbv"]
Mar 08 06:18:05 crc kubenswrapper[4717]: I0308 06:18:05.795166 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8331fca-9814-4444-bbbd-81a5ae2bd273" path="/var/lib/kubelet/pods/d8331fca-9814-4444-bbbd-81a5ae2bd273/volumes"
Mar 08 06:18:14 crc kubenswrapper[4717]: I0308 06:18:14.782561 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:18:14 crc kubenswrapper[4717]: E0308 06:18:14.783758 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:18:26 crc kubenswrapper[4717]: I0308 06:18:26.781958 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:18:26 crc kubenswrapper[4717]: E0308 06:18:26.783093 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:18:37 crc kubenswrapper[4717]: I0308 06:18:37.285722 4717 scope.go:117] "RemoveContainer" containerID="fbae62e7036aba11c0a4a54cb0771051eb9a750864a94c8fcc954509da3ba203"
Mar 08 06:18:37 crc kubenswrapper[4717]: I0308 06:18:37.782360 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:18:37 crc kubenswrapper[4717]: E0308 06:18:37.785345 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:18:51 crc kubenswrapper[4717]: I0308 06:18:51.782224 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:18:51 crc kubenswrapper[4717]: E0308 06:18:51.783774 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:19:03 crc kubenswrapper[4717]: I0308 06:19:03.794971 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:19:03 crc kubenswrapper[4717]: E0308 06:19:03.798261 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:19:15 crc kubenswrapper[4717]: I0308 06:19:15.781654 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:19:15 crc kubenswrapper[4717]: E0308 06:19:15.782787 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:19:27 crc kubenswrapper[4717]: I0308 06:19:27.782495 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:19:27 crc kubenswrapper[4717]: E0308 06:19:27.783661 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:19:42 crc kubenswrapper[4717]: I0308 06:19:42.782070 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:19:42 crc kubenswrapper[4717]: E0308 06:19:42.783156 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:19:54 crc kubenswrapper[4717]: I0308 06:19:54.781652 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:19:54 crc kubenswrapper[4717]: E0308 06:19:54.782838 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.163530 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549180-mnjdx"]
Mar 08 06:20:00 crc kubenswrapper[4717]: E0308 06:20:00.164554 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d940c69-ce7d-45dc-be11-463c7e8b0b9e" containerName="oc"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.164571 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d940c69-ce7d-45dc-be11-463c7e8b0b9e" containerName="oc"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.164920 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d940c69-ce7d-45dc-be11-463c7e8b0b9e" containerName="oc"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.165804 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549180-mnjdx"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.168305 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.168975 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.172166 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.180544 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549180-mnjdx"]
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.229457 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wtpg\" (UniqueName: \"kubernetes.io/projected/5a3959ef-9a09-4399-9a35-299eabd577f4-kube-api-access-8wtpg\") pod \"auto-csr-approver-29549180-mnjdx\" (UID: \"5a3959ef-9a09-4399-9a35-299eabd577f4\") " pod="openshift-infra/auto-csr-approver-29549180-mnjdx"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.332374 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wtpg\" (UniqueName: \"kubernetes.io/projected/5a3959ef-9a09-4399-9a35-299eabd577f4-kube-api-access-8wtpg\") pod \"auto-csr-approver-29549180-mnjdx\" (UID: \"5a3959ef-9a09-4399-9a35-299eabd577f4\") " pod="openshift-infra/auto-csr-approver-29549180-mnjdx"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.358858 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wtpg\" (UniqueName: \"kubernetes.io/projected/5a3959ef-9a09-4399-9a35-299eabd577f4-kube-api-access-8wtpg\") pod \"auto-csr-approver-29549180-mnjdx\" (UID: \"5a3959ef-9a09-4399-9a35-299eabd577f4\") " pod="openshift-infra/auto-csr-approver-29549180-mnjdx"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.504254 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549180-mnjdx"
Mar 08 06:20:00 crc kubenswrapper[4717]: I0308 06:20:00.787219 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549180-mnjdx"]
Mar 08 06:20:01 crc kubenswrapper[4717]: I0308 06:20:01.718806 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549180-mnjdx" event={"ID":"5a3959ef-9a09-4399-9a35-299eabd577f4","Type":"ContainerStarted","Data":"ddf604f8333d6341f1d6503f44254c9deaeab3225a3e7ca5dd00b8a0e93126b7"}
Mar 08 06:20:02 crc kubenswrapper[4717]: I0308 06:20:02.737253 4717 generic.go:334] "Generic (PLEG): container finished" podID="5a3959ef-9a09-4399-9a35-299eabd577f4" containerID="2ac33ed076a45fa5b5a5d57c6c489740182513a02e91f53e04cfcd3c35bd879f" exitCode=0
Mar 08 06:20:02 crc kubenswrapper[4717]: I0308 06:20:02.737342 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549180-mnjdx" event={"ID":"5a3959ef-9a09-4399-9a35-299eabd577f4","Type":"ContainerDied","Data":"2ac33ed076a45fa5b5a5d57c6c489740182513a02e91f53e04cfcd3c35bd879f"}
Mar 08 06:20:04 crc kubenswrapper[4717]: I0308 06:20:04.176662 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549180-mnjdx"
Mar 08 06:20:04 crc kubenswrapper[4717]: I0308 06:20:04.222493 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wtpg\" (UniqueName: \"kubernetes.io/projected/5a3959ef-9a09-4399-9a35-299eabd577f4-kube-api-access-8wtpg\") pod \"5a3959ef-9a09-4399-9a35-299eabd577f4\" (UID: \"5a3959ef-9a09-4399-9a35-299eabd577f4\") "
Mar 08 06:20:04 crc kubenswrapper[4717]: I0308 06:20:04.228864 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a3959ef-9a09-4399-9a35-299eabd577f4-kube-api-access-8wtpg" (OuterVolumeSpecName: "kube-api-access-8wtpg") pod "5a3959ef-9a09-4399-9a35-299eabd577f4" (UID: "5a3959ef-9a09-4399-9a35-299eabd577f4"). InnerVolumeSpecName "kube-api-access-8wtpg".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:20:04 crc kubenswrapper[4717]: I0308 06:20:04.325316 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wtpg\" (UniqueName: \"kubernetes.io/projected/5a3959ef-9a09-4399-9a35-299eabd577f4-kube-api-access-8wtpg\") on node \"crc\" DevicePath \"\"" Mar 08 06:20:04 crc kubenswrapper[4717]: I0308 06:20:04.760847 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549180-mnjdx" event={"ID":"5a3959ef-9a09-4399-9a35-299eabd577f4","Type":"ContainerDied","Data":"ddf604f8333d6341f1d6503f44254c9deaeab3225a3e7ca5dd00b8a0e93126b7"} Mar 08 06:20:04 crc kubenswrapper[4717]: I0308 06:20:04.761285 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddf604f8333d6341f1d6503f44254c9deaeab3225a3e7ca5dd00b8a0e93126b7" Mar 08 06:20:04 crc kubenswrapper[4717]: I0308 06:20:04.760923 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549180-mnjdx" Mar 08 06:20:05 crc kubenswrapper[4717]: I0308 06:20:05.270191 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549174-wmnzs"] Mar 08 06:20:05 crc kubenswrapper[4717]: I0308 06:20:05.285258 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549174-wmnzs"] Mar 08 06:20:05 crc kubenswrapper[4717]: I0308 06:20:05.805861 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75f0624-bcc6-4e55-90f2-f9bae2782042" path="/var/lib/kubelet/pods/e75f0624-bcc6-4e55-90f2-f9bae2782042/volumes" Mar 08 06:20:07 crc kubenswrapper[4717]: I0308 06:20:07.782369 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:20:07 crc kubenswrapper[4717]: E0308 06:20:07.783434 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:20:21 crc kubenswrapper[4717]: I0308 06:20:21.782032 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:20:21 crc kubenswrapper[4717]: E0308 06:20:21.783123 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:20:35 crc kubenswrapper[4717]: I0308 06:20:35.781342 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:20:35 crc kubenswrapper[4717]: E0308 06:20:35.782145 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:20:37 crc kubenswrapper[4717]: I0308 06:20:37.416238 4717 scope.go:117] "RemoveContainer" containerID="34eb5ea1c4ffc90b477f6f6c68dd3d62bcf69b3f31a89d25d4089a024fa8137d" Mar 08 06:20:49 crc kubenswrapper[4717]: I0308 06:20:49.782545 4717 scope.go:117] "RemoveContainer" 
containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:20:49 crc kubenswrapper[4717]: E0308 06:20:49.783776 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:21:01 crc kubenswrapper[4717]: I0308 06:21:01.781248 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:21:01 crc kubenswrapper[4717]: E0308 06:21:01.781905 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.161157 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fmgjj"] Mar 08 06:21:03 crc kubenswrapper[4717]: E0308 06:21:03.161982 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3959ef-9a09-4399-9a35-299eabd577f4" containerName="oc" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.162000 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3959ef-9a09-4399-9a35-299eabd577f4" containerName="oc" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.165859 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3959ef-9a09-4399-9a35-299eabd577f4" containerName="oc" Mar 08 06:21:03 crc 
kubenswrapper[4717]: I0308 06:21:03.174074 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.188705 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fmgjj"] Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.302815 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-utilities\") pod \"redhat-marketplace-fmgjj\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.302969 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-catalog-content\") pod \"redhat-marketplace-fmgjj\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.303027 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8926w\" (UniqueName: \"kubernetes.io/projected/5085ef5c-3f72-426f-bac3-df5134c7bea8-kube-api-access-8926w\") pod \"redhat-marketplace-fmgjj\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.405426 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-utilities\") pod \"redhat-marketplace-fmgjj\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc 
kubenswrapper[4717]: I0308 06:21:03.405540 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-catalog-content\") pod \"redhat-marketplace-fmgjj\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.405594 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8926w\" (UniqueName: \"kubernetes.io/projected/5085ef5c-3f72-426f-bac3-df5134c7bea8-kube-api-access-8926w\") pod \"redhat-marketplace-fmgjj\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.406056 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-utilities\") pod \"redhat-marketplace-fmgjj\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.406138 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-catalog-content\") pod \"redhat-marketplace-fmgjj\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.428375 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8926w\" (UniqueName: \"kubernetes.io/projected/5085ef5c-3f72-426f-bac3-df5134c7bea8-kube-api-access-8926w\") pod \"redhat-marketplace-fmgjj\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 
06:21:03.503881 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:03 crc kubenswrapper[4717]: I0308 06:21:03.981813 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fmgjj"] Mar 08 06:21:04 crc kubenswrapper[4717]: I0308 06:21:04.459870 4717 generic.go:334] "Generic (PLEG): container finished" podID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerID="eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f" exitCode=0 Mar 08 06:21:04 crc kubenswrapper[4717]: I0308 06:21:04.460080 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmgjj" event={"ID":"5085ef5c-3f72-426f-bac3-df5134c7bea8","Type":"ContainerDied","Data":"eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f"} Mar 08 06:21:04 crc kubenswrapper[4717]: I0308 06:21:04.460175 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmgjj" event={"ID":"5085ef5c-3f72-426f-bac3-df5134c7bea8","Type":"ContainerStarted","Data":"744732732f77a65b3ecc7420d19887a459be622fedd334c71251fb3799c78147"} Mar 08 06:21:04 crc kubenswrapper[4717]: I0308 06:21:04.462765 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 06:21:06 crc kubenswrapper[4717]: I0308 06:21:06.486292 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmgjj" event={"ID":"5085ef5c-3f72-426f-bac3-df5134c7bea8","Type":"ContainerStarted","Data":"5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6"} Mar 08 06:21:07 crc kubenswrapper[4717]: I0308 06:21:07.506653 4717 generic.go:334] "Generic (PLEG): container finished" podID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerID="5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6" exitCode=0 Mar 08 06:21:07 crc kubenswrapper[4717]: 
I0308 06:21:07.506925 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmgjj" event={"ID":"5085ef5c-3f72-426f-bac3-df5134c7bea8","Type":"ContainerDied","Data":"5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6"} Mar 08 06:21:08 crc kubenswrapper[4717]: I0308 06:21:08.517861 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmgjj" event={"ID":"5085ef5c-3f72-426f-bac3-df5134c7bea8","Type":"ContainerStarted","Data":"7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0"} Mar 08 06:21:08 crc kubenswrapper[4717]: I0308 06:21:08.540033 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fmgjj" podStartSLOduration=2.095011378 podStartE2EDuration="5.540015578s" podCreationTimestamp="2026-03-08 06:21:03 +0000 UTC" firstStartedPulling="2026-03-08 06:21:04.462524713 +0000 UTC m=+3291.380173547" lastFinishedPulling="2026-03-08 06:21:07.907528883 +0000 UTC m=+3294.825177747" observedRunningTime="2026-03-08 06:21:08.534300827 +0000 UTC m=+3295.451949691" watchObservedRunningTime="2026-03-08 06:21:08.540015578 +0000 UTC m=+3295.457664432" Mar 08 06:21:13 crc kubenswrapper[4717]: I0308 06:21:13.504867 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:13 crc kubenswrapper[4717]: I0308 06:21:13.505385 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:13 crc kubenswrapper[4717]: I0308 06:21:13.596343 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:13 crc kubenswrapper[4717]: I0308 06:21:13.680133 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 
06:21:13 crc kubenswrapper[4717]: I0308 06:21:13.841467 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fmgjj"] Mar 08 06:21:15 crc kubenswrapper[4717]: I0308 06:21:15.606580 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fmgjj" podUID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerName="registry-server" containerID="cri-o://7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0" gracePeriod=2 Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.105273 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.206451 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8926w\" (UniqueName: \"kubernetes.io/projected/5085ef5c-3f72-426f-bac3-df5134c7bea8-kube-api-access-8926w\") pod \"5085ef5c-3f72-426f-bac3-df5134c7bea8\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.206566 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-catalog-content\") pod \"5085ef5c-3f72-426f-bac3-df5134c7bea8\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.206915 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-utilities\") pod \"5085ef5c-3f72-426f-bac3-df5134c7bea8\" (UID: \"5085ef5c-3f72-426f-bac3-df5134c7bea8\") " Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.208256 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-utilities" (OuterVolumeSpecName: "utilities") pod "5085ef5c-3f72-426f-bac3-df5134c7bea8" (UID: "5085ef5c-3f72-426f-bac3-df5134c7bea8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.215915 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5085ef5c-3f72-426f-bac3-df5134c7bea8-kube-api-access-8926w" (OuterVolumeSpecName: "kube-api-access-8926w") pod "5085ef5c-3f72-426f-bac3-df5134c7bea8" (UID: "5085ef5c-3f72-426f-bac3-df5134c7bea8"). InnerVolumeSpecName "kube-api-access-8926w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.253372 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5085ef5c-3f72-426f-bac3-df5134c7bea8" (UID: "5085ef5c-3f72-426f-bac3-df5134c7bea8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.309993 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8926w\" (UniqueName: \"kubernetes.io/projected/5085ef5c-3f72-426f-bac3-df5134c7bea8-kube-api-access-8926w\") on node \"crc\" DevicePath \"\"" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.310028 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.310042 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5085ef5c-3f72-426f-bac3-df5134c7bea8-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.628521 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fmgjj" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.628570 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmgjj" event={"ID":"5085ef5c-3f72-426f-bac3-df5134c7bea8","Type":"ContainerDied","Data":"7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0"} Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.629054 4717 scope.go:117] "RemoveContainer" containerID="7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.628378 4717 generic.go:334] "Generic (PLEG): container finished" podID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerID="7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0" exitCode=0 Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.638872 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fmgjj" 
event={"ID":"5085ef5c-3f72-426f-bac3-df5134c7bea8","Type":"ContainerDied","Data":"744732732f77a65b3ecc7420d19887a459be622fedd334c71251fb3799c78147"} Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.673897 4717 scope.go:117] "RemoveContainer" containerID="5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.703446 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fmgjj"] Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.714952 4717 scope.go:117] "RemoveContainer" containerID="eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.718192 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fmgjj"] Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.765315 4717 scope.go:117] "RemoveContainer" containerID="7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0" Mar 08 06:21:16 crc kubenswrapper[4717]: E0308 06:21:16.766153 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0\": container with ID starting with 7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0 not found: ID does not exist" containerID="7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.766231 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0"} err="failed to get container status \"7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0\": rpc error: code = NotFound desc = could not find container \"7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0\": container with ID starting with 
7dc9d12df3bd3de85c7d2e835417cae5898b07e7b6c1be73d7c42f8b025a59d0 not found: ID does not exist" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.766277 4717 scope.go:117] "RemoveContainer" containerID="5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6" Mar 08 06:21:16 crc kubenswrapper[4717]: E0308 06:21:16.767018 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6\": container with ID starting with 5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6 not found: ID does not exist" containerID="5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.767087 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6"} err="failed to get container status \"5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6\": rpc error: code = NotFound desc = could not find container \"5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6\": container with ID starting with 5d7bc49b5dada7d9876608db4d29425a02d1c24f7ce1a66e5bd33ef5dece13b6 not found: ID does not exist" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.767129 4717 scope.go:117] "RemoveContainer" containerID="eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f" Mar 08 06:21:16 crc kubenswrapper[4717]: E0308 06:21:16.767652 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f\": container with ID starting with eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f not found: ID does not exist" containerID="eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f" Mar 08 06:21:16 crc 
kubenswrapper[4717]: I0308 06:21:16.767749 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f"} err="failed to get container status \"eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f\": rpc error: code = NotFound desc = could not find container \"eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f\": container with ID starting with eeba8005451a3f3a0d2f89f65aa6859cb5168f8f5f91886f6511fcc51b08fa3f not found: ID does not exist" Mar 08 06:21:16 crc kubenswrapper[4717]: I0308 06:21:16.782062 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f" Mar 08 06:21:17 crc kubenswrapper[4717]: I0308 06:21:17.660838 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"d05f173ba54b5f927464ed4171379c1843a6f24f88a932bee340d1c2aecf97fc"} Mar 08 06:21:17 crc kubenswrapper[4717]: I0308 06:21:17.800140 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5085ef5c-3f72-426f-bac3-df5134c7bea8" path="/var/lib/kubelet/pods/5085ef5c-3f72-426f-bac3-df5134c7bea8/volumes" Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.215764 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549182-srx98"] Mar 08 06:22:00 crc kubenswrapper[4717]: E0308 06:22:00.217446 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerName="extract-content" Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.217481 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerName="extract-content" Mar 08 06:22:00 crc kubenswrapper[4717]: E0308 06:22:00.217508 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerName="extract-utilities" Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.217528 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerName="extract-utilities" Mar 08 06:22:00 crc kubenswrapper[4717]: E0308 06:22:00.217597 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerName="registry-server" Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.217619 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerName="registry-server" Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.218155 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5085ef5c-3f72-426f-bac3-df5134c7bea8" containerName="registry-server" Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.219770 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549182-srx98"
Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.222700 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.223608 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm"
Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.226056 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.245197 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549182-srx98"]
Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.302798 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsq77\" (UniqueName: \"kubernetes.io/projected/d2839767-7bc6-4fd5-8254-d68ad53c4a6e-kube-api-access-qsq77\") pod \"auto-csr-approver-29549182-srx98\" (UID: \"d2839767-7bc6-4fd5-8254-d68ad53c4a6e\") " pod="openshift-infra/auto-csr-approver-29549182-srx98"
Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.405202 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsq77\" (UniqueName: \"kubernetes.io/projected/d2839767-7bc6-4fd5-8254-d68ad53c4a6e-kube-api-access-qsq77\") pod \"auto-csr-approver-29549182-srx98\" (UID: \"d2839767-7bc6-4fd5-8254-d68ad53c4a6e\") " pod="openshift-infra/auto-csr-approver-29549182-srx98"
Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.431755 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsq77\" (UniqueName: \"kubernetes.io/projected/d2839767-7bc6-4fd5-8254-d68ad53c4a6e-kube-api-access-qsq77\") pod \"auto-csr-approver-29549182-srx98\" (UID: \"d2839767-7bc6-4fd5-8254-d68ad53c4a6e\") " pod="openshift-infra/auto-csr-approver-29549182-srx98"
Mar 08 06:22:00 crc kubenswrapper[4717]: I0308 06:22:00.548613 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549182-srx98"
Mar 08 06:22:01 crc kubenswrapper[4717]: I0308 06:22:01.120798 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549182-srx98"]
Mar 08 06:22:01 crc kubenswrapper[4717]: I0308 06:22:01.361307 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549182-srx98" event={"ID":"d2839767-7bc6-4fd5-8254-d68ad53c4a6e","Type":"ContainerStarted","Data":"0824015d51e51673eb7c69b5b6a7fb2e4acde228f48e05eee9d53a52fb28096b"}
Mar 08 06:22:02 crc kubenswrapper[4717]: I0308 06:22:02.373828 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549182-srx98" event={"ID":"d2839767-7bc6-4fd5-8254-d68ad53c4a6e","Type":"ContainerStarted","Data":"3daccea532945de186a6c832ba1e4d0bd2ae02f5cbde6f8e881dd787950464d9"}
Mar 08 06:22:02 crc kubenswrapper[4717]: I0308 06:22:02.392925 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549182-srx98" podStartSLOduration=1.524259102 podStartE2EDuration="2.392904256s" podCreationTimestamp="2026-03-08 06:22:00 +0000 UTC" firstStartedPulling="2026-03-08 06:22:01.09137603 +0000 UTC m=+3348.009024884" lastFinishedPulling="2026-03-08 06:22:01.960021184 +0000 UTC m=+3348.877670038" observedRunningTime="2026-03-08 06:22:02.390272501 +0000 UTC m=+3349.307921375" watchObservedRunningTime="2026-03-08 06:22:02.392904256 +0000 UTC m=+3349.310553130"
Mar 08 06:22:03 crc kubenswrapper[4717]: I0308 06:22:03.383633 4717 generic.go:334] "Generic (PLEG): container finished" podID="d2839767-7bc6-4fd5-8254-d68ad53c4a6e" containerID="3daccea532945de186a6c832ba1e4d0bd2ae02f5cbde6f8e881dd787950464d9" exitCode=0
Mar 08 06:22:03 crc kubenswrapper[4717]: I0308 06:22:03.383719 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549182-srx98" event={"ID":"d2839767-7bc6-4fd5-8254-d68ad53c4a6e","Type":"ContainerDied","Data":"3daccea532945de186a6c832ba1e4d0bd2ae02f5cbde6f8e881dd787950464d9"}
Mar 08 06:22:04 crc kubenswrapper[4717]: I0308 06:22:04.853444 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549182-srx98"
Mar 08 06:22:04 crc kubenswrapper[4717]: I0308 06:22:04.920311 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsq77\" (UniqueName: \"kubernetes.io/projected/d2839767-7bc6-4fd5-8254-d68ad53c4a6e-kube-api-access-qsq77\") pod \"d2839767-7bc6-4fd5-8254-d68ad53c4a6e\" (UID: \"d2839767-7bc6-4fd5-8254-d68ad53c4a6e\") "
Mar 08 06:22:04 crc kubenswrapper[4717]: I0308 06:22:04.928024 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2839767-7bc6-4fd5-8254-d68ad53c4a6e-kube-api-access-qsq77" (OuterVolumeSpecName: "kube-api-access-qsq77") pod "d2839767-7bc6-4fd5-8254-d68ad53c4a6e" (UID: "d2839767-7bc6-4fd5-8254-d68ad53c4a6e"). InnerVolumeSpecName "kube-api-access-qsq77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 06:22:05 crc kubenswrapper[4717]: I0308 06:22:05.026238 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsq77\" (UniqueName: \"kubernetes.io/projected/d2839767-7bc6-4fd5-8254-d68ad53c4a6e-kube-api-access-qsq77\") on node \"crc\" DevicePath \"\""
Mar 08 06:22:05 crc kubenswrapper[4717]: I0308 06:22:05.412493 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549182-srx98" event={"ID":"d2839767-7bc6-4fd5-8254-d68ad53c4a6e","Type":"ContainerDied","Data":"0824015d51e51673eb7c69b5b6a7fb2e4acde228f48e05eee9d53a52fb28096b"}
Mar 08 06:22:05 crc kubenswrapper[4717]: I0308 06:22:05.412600 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549182-srx98"
Mar 08 06:22:05 crc kubenswrapper[4717]: I0308 06:22:05.412606 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0824015d51e51673eb7c69b5b6a7fb2e4acde228f48e05eee9d53a52fb28096b"
Mar 08 06:22:05 crc kubenswrapper[4717]: I0308 06:22:05.489484 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549176-dkztq"]
Mar 08 06:22:05 crc kubenswrapper[4717]: I0308 06:22:05.509358 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549176-dkztq"]
Mar 08 06:22:05 crc kubenswrapper[4717]: I0308 06:22:05.793460 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a0fb95-6ecb-4e05-9585-ab3757f20000" path="/var/lib/kubelet/pods/92a0fb95-6ecb-4e05-9585-ab3757f20000/volumes"
Mar 08 06:22:37 crc kubenswrapper[4717]: I0308 06:22:37.569734 4717 scope.go:117] "RemoveContainer" containerID="ba05100014f43c0d4cf347d295d68bdfc53996477cd7c5e07b0ca278cef47991"
Mar 08 06:23:34 crc kubenswrapper[4717]: I0308 06:23:34.119723 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 06:23:34 crc kubenswrapper[4717]: I0308 06:23:34.120355 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.153608 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549184-fbdcm"]
Mar 08 06:24:00 crc kubenswrapper[4717]: E0308 06:24:00.157025 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2839767-7bc6-4fd5-8254-d68ad53c4a6e" containerName="oc"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.157043 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2839767-7bc6-4fd5-8254-d68ad53c4a6e" containerName="oc"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.157301 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2839767-7bc6-4fd5-8254-d68ad53c4a6e" containerName="oc"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.158098 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549184-fbdcm"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.162281 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.162330 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.162625 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.167863 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549184-fbdcm"]
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.346581 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5qx5\" (UniqueName: \"kubernetes.io/projected/1718a93f-8cc0-4b53-8864-5c1d6f56b849-kube-api-access-f5qx5\") pod \"auto-csr-approver-29549184-fbdcm\" (UID: \"1718a93f-8cc0-4b53-8864-5c1d6f56b849\") " pod="openshift-infra/auto-csr-approver-29549184-fbdcm"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.450056 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5qx5\" (UniqueName: \"kubernetes.io/projected/1718a93f-8cc0-4b53-8864-5c1d6f56b849-kube-api-access-f5qx5\") pod \"auto-csr-approver-29549184-fbdcm\" (UID: \"1718a93f-8cc0-4b53-8864-5c1d6f56b849\") " pod="openshift-infra/auto-csr-approver-29549184-fbdcm"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.475440 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5qx5\" (UniqueName: \"kubernetes.io/projected/1718a93f-8cc0-4b53-8864-5c1d6f56b849-kube-api-access-f5qx5\") pod \"auto-csr-approver-29549184-fbdcm\" (UID: \"1718a93f-8cc0-4b53-8864-5c1d6f56b849\") " pod="openshift-infra/auto-csr-approver-29549184-fbdcm"
Mar 08 06:24:00 crc kubenswrapper[4717]: I0308 06:24:00.509543 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549184-fbdcm"
Mar 08 06:24:01 crc kubenswrapper[4717]: I0308 06:24:01.077445 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549184-fbdcm"]
Mar 08 06:24:01 crc kubenswrapper[4717]: I0308 06:24:01.852531 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549184-fbdcm" event={"ID":"1718a93f-8cc0-4b53-8864-5c1d6f56b849","Type":"ContainerStarted","Data":"4ea7eb4134a68db241c84dc97edc637949215f4ae759847c203bb73e79280f55"}
Mar 08 06:24:02 crc kubenswrapper[4717]: I0308 06:24:02.900180 4717 generic.go:334] "Generic (PLEG): container finished" podID="1718a93f-8cc0-4b53-8864-5c1d6f56b849" containerID="59d84a015c00b07037c34968e6fe0cf0d1ffc7a8b0604c9e8fd450737d454719" exitCode=0
Mar 08 06:24:02 crc kubenswrapper[4717]: I0308 06:24:02.900275 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549184-fbdcm" event={"ID":"1718a93f-8cc0-4b53-8864-5c1d6f56b849","Type":"ContainerDied","Data":"59d84a015c00b07037c34968e6fe0cf0d1ffc7a8b0604c9e8fd450737d454719"}
Mar 08 06:24:04 crc kubenswrapper[4717]: I0308 06:24:04.122150 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 06:24:04 crc kubenswrapper[4717]: I0308 06:24:04.122488 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 06:24:04 crc kubenswrapper[4717]: I0308 06:24:04.402185 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549184-fbdcm"
Mar 08 06:24:04 crc kubenswrapper[4717]: I0308 06:24:04.555334 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5qx5\" (UniqueName: \"kubernetes.io/projected/1718a93f-8cc0-4b53-8864-5c1d6f56b849-kube-api-access-f5qx5\") pod \"1718a93f-8cc0-4b53-8864-5c1d6f56b849\" (UID: \"1718a93f-8cc0-4b53-8864-5c1d6f56b849\") "
Mar 08 06:24:04 crc kubenswrapper[4717]: I0308 06:24:04.569076 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1718a93f-8cc0-4b53-8864-5c1d6f56b849-kube-api-access-f5qx5" (OuterVolumeSpecName: "kube-api-access-f5qx5") pod "1718a93f-8cc0-4b53-8864-5c1d6f56b849" (UID: "1718a93f-8cc0-4b53-8864-5c1d6f56b849"). InnerVolumeSpecName "kube-api-access-f5qx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 06:24:04 crc kubenswrapper[4717]: I0308 06:24:04.658113 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5qx5\" (UniqueName: \"kubernetes.io/projected/1718a93f-8cc0-4b53-8864-5c1d6f56b849-kube-api-access-f5qx5\") on node \"crc\" DevicePath \"\""
Mar 08 06:24:04 crc kubenswrapper[4717]: I0308 06:24:04.937465 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549184-fbdcm" event={"ID":"1718a93f-8cc0-4b53-8864-5c1d6f56b849","Type":"ContainerDied","Data":"4ea7eb4134a68db241c84dc97edc637949215f4ae759847c203bb73e79280f55"}
Mar 08 06:24:04 crc kubenswrapper[4717]: I0308 06:24:04.937901 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ea7eb4134a68db241c84dc97edc637949215f4ae759847c203bb73e79280f55"
Mar 08 06:24:04 crc kubenswrapper[4717]: I0308 06:24:04.937772 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549184-fbdcm"
Mar 08 06:24:05 crc kubenswrapper[4717]: I0308 06:24:05.487238 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549178-zphcw"]
Mar 08 06:24:05 crc kubenswrapper[4717]: I0308 06:24:05.497869 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549178-zphcw"]
Mar 08 06:24:05 crc kubenswrapper[4717]: I0308 06:24:05.797854 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d940c69-ce7d-45dc-be11-463c7e8b0b9e" path="/var/lib/kubelet/pods/2d940c69-ce7d-45dc-be11-463c7e8b0b9e/volumes"
Mar 08 06:24:34 crc kubenswrapper[4717]: I0308 06:24:34.119587 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 06:24:34 crc kubenswrapper[4717]: I0308 06:24:34.120411 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 06:24:34 crc kubenswrapper[4717]: I0308 06:24:34.120482 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf"
Mar 08 06:24:34 crc kubenswrapper[4717]: I0308 06:24:34.121848 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d05f173ba54b5f927464ed4171379c1843a6f24f88a932bee340d1c2aecf97fc"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 06:24:34 crc kubenswrapper[4717]: I0308 06:24:34.121966 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://d05f173ba54b5f927464ed4171379c1843a6f24f88a932bee340d1c2aecf97fc" gracePeriod=600
Mar 08 06:24:34 crc kubenswrapper[4717]: I0308 06:24:34.279458 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="d05f173ba54b5f927464ed4171379c1843a6f24f88a932bee340d1c2aecf97fc" exitCode=0
Mar 08 06:24:34 crc kubenswrapper[4717]: I0308 06:24:34.279568 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"d05f173ba54b5f927464ed4171379c1843a6f24f88a932bee340d1c2aecf97fc"}
Mar 08 06:24:34 crc kubenswrapper[4717]: I0308 06:24:34.279993 4717 scope.go:117] "RemoveContainer" containerID="77dc73d85898a59acdc247c5fbfeea53d0db61d50c404353cfcf69b2d0b3554f"
Mar 08 06:24:35 crc kubenswrapper[4717]: I0308 06:24:35.318438 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1"}
Mar 08 06:24:37 crc kubenswrapper[4717]: I0308 06:24:37.695946 4717 scope.go:117] "RemoveContainer" containerID="82f863aa82489fe42508f6dd874792d823b6fc89b2c6c2be4aac047c85318af4"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.427175 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fpcgg"]
Mar 08 06:24:54 crc kubenswrapper[4717]: E0308 06:24:54.428224 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1718a93f-8cc0-4b53-8864-5c1d6f56b849" containerName="oc"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.428240 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1718a93f-8cc0-4b53-8864-5c1d6f56b849" containerName="oc"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.428476 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1718a93f-8cc0-4b53-8864-5c1d6f56b849" containerName="oc"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.430284 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.450034 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fpcgg"]
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.529035 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e9cba03-4895-49ad-ae18-c6d5ebd55311-catalog-content\") pod \"community-operators-fpcgg\" (UID: \"4e9cba03-4895-49ad-ae18-c6d5ebd55311\") " pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.529094 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qv62\" (UniqueName: \"kubernetes.io/projected/4e9cba03-4895-49ad-ae18-c6d5ebd55311-kube-api-access-8qv62\") pod \"community-operators-fpcgg\" (UID: \"4e9cba03-4895-49ad-ae18-c6d5ebd55311\") " pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.529186 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e9cba03-4895-49ad-ae18-c6d5ebd55311-utilities\") pod \"community-operators-fpcgg\" (UID: \"4e9cba03-4895-49ad-ae18-c6d5ebd55311\") " pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.631241 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e9cba03-4895-49ad-ae18-c6d5ebd55311-catalog-content\") pod \"community-operators-fpcgg\" (UID: \"4e9cba03-4895-49ad-ae18-c6d5ebd55311\") " pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.631293 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qv62\" (UniqueName: \"kubernetes.io/projected/4e9cba03-4895-49ad-ae18-c6d5ebd55311-kube-api-access-8qv62\") pod \"community-operators-fpcgg\" (UID: \"4e9cba03-4895-49ad-ae18-c6d5ebd55311\") " pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.631351 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e9cba03-4895-49ad-ae18-c6d5ebd55311-utilities\") pod \"community-operators-fpcgg\" (UID: \"4e9cba03-4895-49ad-ae18-c6d5ebd55311\") " pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.631909 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e9cba03-4895-49ad-ae18-c6d5ebd55311-utilities\") pod \"community-operators-fpcgg\" (UID: \"4e9cba03-4895-49ad-ae18-c6d5ebd55311\") " pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.631965 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e9cba03-4895-49ad-ae18-c6d5ebd55311-catalog-content\") pod \"community-operators-fpcgg\" (UID: \"4e9cba03-4895-49ad-ae18-c6d5ebd55311\") " pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.659225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qv62\" (UniqueName: \"kubernetes.io/projected/4e9cba03-4895-49ad-ae18-c6d5ebd55311-kube-api-access-8qv62\") pod \"community-operators-fpcgg\" (UID: \"4e9cba03-4895-49ad-ae18-c6d5ebd55311\") " pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:54 crc kubenswrapper[4717]: I0308 06:24:54.764621 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:24:55 crc kubenswrapper[4717]: I0308 06:24:55.269533 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fpcgg"]
Mar 08 06:24:55 crc kubenswrapper[4717]: I0308 06:24:55.565142 4717 generic.go:334] "Generic (PLEG): container finished" podID="4e9cba03-4895-49ad-ae18-c6d5ebd55311" containerID="3eadec70a9567ff9e54057ad85c8417da43b96c8071750f75f9fda61f21e5309" exitCode=0
Mar 08 06:24:55 crc kubenswrapper[4717]: I0308 06:24:55.565208 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpcgg" event={"ID":"4e9cba03-4895-49ad-ae18-c6d5ebd55311","Type":"ContainerDied","Data":"3eadec70a9567ff9e54057ad85c8417da43b96c8071750f75f9fda61f21e5309"}
Mar 08 06:24:55 crc kubenswrapper[4717]: I0308 06:24:55.565281 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpcgg" event={"ID":"4e9cba03-4895-49ad-ae18-c6d5ebd55311","Type":"ContainerStarted","Data":"ede14353893f547e57e5a814610d767ffdf35bbbf888fd2e3c14f2fed9817970"}
Mar 08 06:25:00 crc kubenswrapper[4717]: I0308 06:25:00.620254 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpcgg" event={"ID":"4e9cba03-4895-49ad-ae18-c6d5ebd55311","Type":"ContainerStarted","Data":"7b7a663655f61fede98f064e0f5b124089f970b079801858c5c4e8d63b74e92f"}
Mar 08 06:25:01 crc kubenswrapper[4717]: I0308 06:25:01.635880 4717 generic.go:334] "Generic (PLEG): container finished" podID="4e9cba03-4895-49ad-ae18-c6d5ebd55311" containerID="7b7a663655f61fede98f064e0f5b124089f970b079801858c5c4e8d63b74e92f" exitCode=0
Mar 08 06:25:01 crc kubenswrapper[4717]: I0308 06:25:01.636035 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpcgg" event={"ID":"4e9cba03-4895-49ad-ae18-c6d5ebd55311","Type":"ContainerDied","Data":"7b7a663655f61fede98f064e0f5b124089f970b079801858c5c4e8d63b74e92f"}
Mar 08 06:25:02 crc kubenswrapper[4717]: I0308 06:25:02.651918 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpcgg" event={"ID":"4e9cba03-4895-49ad-ae18-c6d5ebd55311","Type":"ContainerStarted","Data":"817ce71f96f6539c2430a5bb55b479fe6a3eb566a93f837c48938e6a7f7772d4"}
Mar 08 06:25:02 crc kubenswrapper[4717]: I0308 06:25:02.688455 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fpcgg" podStartSLOduration=2.218220356 podStartE2EDuration="8.688433457s" podCreationTimestamp="2026-03-08 06:24:54 +0000 UTC" firstStartedPulling="2026-03-08 06:24:55.570878367 +0000 UTC m=+3522.488527261" lastFinishedPulling="2026-03-08 06:25:02.041091478 +0000 UTC m=+3528.958740362" observedRunningTime="2026-03-08 06:25:02.673557121 +0000 UTC m=+3529.591205995" watchObservedRunningTime="2026-03-08 06:25:02.688433457 +0000 UTC m=+3529.606082301"
Mar 08 06:25:04 crc kubenswrapper[4717]: I0308 06:25:04.765458 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:25:04 crc kubenswrapper[4717]: I0308 06:25:04.766192 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:25:04 crc kubenswrapper[4717]: I0308 06:25:04.844329 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:25:14 crc kubenswrapper[4717]: I0308 06:25:14.853637 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fpcgg"
Mar 08 06:25:14 crc kubenswrapper[4717]: I0308 06:25:14.961231 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fpcgg"]
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.014959 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gzsvv"]
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.015668 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gzsvv" podUID="9e076e50-edc7-4172-bb69-ca35340a0f0b" containerName="registry-server" containerID="cri-o://4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b" gracePeriod=2
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.525632 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gzsvv"
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.640650 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-utilities\") pod \"9e076e50-edc7-4172-bb69-ca35340a0f0b\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") "
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.640807 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-catalog-content\") pod \"9e076e50-edc7-4172-bb69-ca35340a0f0b\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") "
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.640829 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbkv8\" (UniqueName: \"kubernetes.io/projected/9e076e50-edc7-4172-bb69-ca35340a0f0b-kube-api-access-vbkv8\") pod \"9e076e50-edc7-4172-bb69-ca35340a0f0b\" (UID: \"9e076e50-edc7-4172-bb69-ca35340a0f0b\") "
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.642450 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-utilities" (OuterVolumeSpecName: "utilities") pod "9e076e50-edc7-4172-bb69-ca35340a0f0b" (UID: "9e076e50-edc7-4172-bb69-ca35340a0f0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.658013 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e076e50-edc7-4172-bb69-ca35340a0f0b-kube-api-access-vbkv8" (OuterVolumeSpecName: "kube-api-access-vbkv8") pod "9e076e50-edc7-4172-bb69-ca35340a0f0b" (UID: "9e076e50-edc7-4172-bb69-ca35340a0f0b"). InnerVolumeSpecName "kube-api-access-vbkv8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.699121 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e076e50-edc7-4172-bb69-ca35340a0f0b" (UID: "9e076e50-edc7-4172-bb69-ca35340a0f0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.743009 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.743040 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e076e50-edc7-4172-bb69-ca35340a0f0b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.743052 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbkv8\" (UniqueName: \"kubernetes.io/projected/9e076e50-edc7-4172-bb69-ca35340a0f0b-kube-api-access-vbkv8\") on node \"crc\" DevicePath \"\""
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.831759 4717 generic.go:334] "Generic (PLEG): container finished" podID="9e076e50-edc7-4172-bb69-ca35340a0f0b" containerID="4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b" exitCode=0
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.831829 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gzsvv"
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.831858 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gzsvv" event={"ID":"9e076e50-edc7-4172-bb69-ca35340a0f0b","Type":"ContainerDied","Data":"4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b"}
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.831907 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gzsvv" event={"ID":"9e076e50-edc7-4172-bb69-ca35340a0f0b","Type":"ContainerDied","Data":"7447a01ddc2f0251fe2055220a7e475560a6daf7ea34ff7aee478753b96c772d"}
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.831933 4717 scope.go:117] "RemoveContainer" containerID="4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b"
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.879433 4717 scope.go:117] "RemoveContainer" containerID="50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617"
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.911238 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gzsvv"]
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.921908 4717 scope.go:117] "RemoveContainer" containerID="203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e"
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.922448 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gzsvv"]
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.955714 4717 scope.go:117] "RemoveContainer" containerID="4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b"
Mar 08 06:25:15 crc kubenswrapper[4717]: E0308 06:25:15.956999 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b\": container with ID starting with 4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b not found: ID does not exist" containerID="4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b"
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.957054 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b"} err="failed to get container status \"4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b\": rpc error: code = NotFound desc = could not find container \"4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b\": container with ID starting with 4028296f17b8efcc9516f1e45a9e306290c308015850c12cf2347169bd76d18b not found: ID does not exist"
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.957083 4717 scope.go:117] "RemoveContainer" containerID="50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617"
Mar 08 06:25:15 crc kubenswrapper[4717]: E0308 06:25:15.957443 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617\": container with ID starting with 50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617 not found: ID does not exist" containerID="50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617"
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.957478 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617"} err="failed to get container status \"50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617\": rpc error: code = NotFound desc = could not find container \"50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617\": container with ID starting with 50b73cd09804998e9f606ca15d39fb7b8469929b9fa80612e2d25f8f31a94617 not found: ID does not exist"
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.957500 4717 scope.go:117] "RemoveContainer" containerID="203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e"
Mar 08 06:25:15 crc kubenswrapper[4717]: E0308 06:25:15.957780 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e\": container with ID starting with 203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e not found: ID does not exist" containerID="203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e"
Mar 08 06:25:15 crc kubenswrapper[4717]: I0308 06:25:15.957809 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e"} err="failed to get container status \"203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e\": rpc error: code = NotFound desc = could not find container \"203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e\": container with ID starting with 203d83cc60d68fc18494144cfc5a88f81636c0eb0b123fb133c9d4433cac216e not found: ID does not exist"
Mar 08 06:25:17 crc kubenswrapper[4717]: I0308 06:25:17.795323 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e076e50-edc7-4172-bb69-ca35340a0f0b" path="/var/lib/kubelet/pods/9e076e50-edc7-4172-bb69-ca35340a0f0b/volumes"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.168041 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549186-7rn67"]
Mar 08 06:26:00 crc kubenswrapper[4717]: E0308 06:26:00.168947 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e076e50-edc7-4172-bb69-ca35340a0f0b" containerName="registry-server"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.168961 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e076e50-edc7-4172-bb69-ca35340a0f0b" containerName="registry-server"
Mar 08 06:26:00 crc kubenswrapper[4717]: E0308 06:26:00.168979 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e076e50-edc7-4172-bb69-ca35340a0f0b" containerName="extract-utilities"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.168985 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e076e50-edc7-4172-bb69-ca35340a0f0b" containerName="extract-utilities"
Mar 08 06:26:00 crc kubenswrapper[4717]: E0308 06:26:00.169000 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e076e50-edc7-4172-bb69-ca35340a0f0b" containerName="extract-content"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.169006 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e076e50-edc7-4172-bb69-ca35340a0f0b" containerName="extract-content"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.169177 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e076e50-edc7-4172-bb69-ca35340a0f0b" containerName="registry-server"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.169823 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549186-7rn67"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.172099 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.173275 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.177707 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549186-7rn67"]
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.181507 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.338505 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbx4p\" (UniqueName: \"kubernetes.io/projected/a2aa1a72-2ad2-4384-bd9d-a078e04c03ca-kube-api-access-nbx4p\") pod \"auto-csr-approver-29549186-7rn67\" (UID: \"a2aa1a72-2ad2-4384-bd9d-a078e04c03ca\") " pod="openshift-infra/auto-csr-approver-29549186-7rn67"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.441916 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbx4p\" (UniqueName: \"kubernetes.io/projected/a2aa1a72-2ad2-4384-bd9d-a078e04c03ca-kube-api-access-nbx4p\") pod \"auto-csr-approver-29549186-7rn67\" (UID: \"a2aa1a72-2ad2-4384-bd9d-a078e04c03ca\") " pod="openshift-infra/auto-csr-approver-29549186-7rn67"
Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.478630 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbx4p\" (UniqueName: \"kubernetes.io/projected/a2aa1a72-2ad2-4384-bd9d-a078e04c03ca-kube-api-access-nbx4p\") pod \"auto-csr-approver-29549186-7rn67\" (UID: \"a2aa1a72-2ad2-4384-bd9d-a078e04c03ca\") " 
pod="openshift-infra/auto-csr-approver-29549186-7rn67" Mar 08 06:26:00 crc kubenswrapper[4717]: I0308 06:26:00.503325 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549186-7rn67" Mar 08 06:26:01 crc kubenswrapper[4717]: I0308 06:26:01.023873 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549186-7rn67"] Mar 08 06:26:01 crc kubenswrapper[4717]: I0308 06:26:01.333289 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549186-7rn67" event={"ID":"a2aa1a72-2ad2-4384-bd9d-a078e04c03ca","Type":"ContainerStarted","Data":"02b99e6593e961fff5eb6cfc63d0244384cc8ac65cb67f944b565f6ca56351d4"} Mar 08 06:26:02 crc kubenswrapper[4717]: I0308 06:26:02.343576 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549186-7rn67" event={"ID":"a2aa1a72-2ad2-4384-bd9d-a078e04c03ca","Type":"ContainerStarted","Data":"387a5454d70eb7d06feb115829ee34d0fa220cee3d8bc4b4ee1c011f4101bd1a"} Mar 08 06:26:02 crc kubenswrapper[4717]: I0308 06:26:02.362418 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549186-7rn67" podStartSLOduration=1.454990077 podStartE2EDuration="2.362394133s" podCreationTimestamp="2026-03-08 06:26:00 +0000 UTC" firstStartedPulling="2026-03-08 06:26:01.012274523 +0000 UTC m=+3587.929923397" lastFinishedPulling="2026-03-08 06:26:01.919678609 +0000 UTC m=+3588.837327453" observedRunningTime="2026-03-08 06:26:02.359949923 +0000 UTC m=+3589.277598767" watchObservedRunningTime="2026-03-08 06:26:02.362394133 +0000 UTC m=+3589.280042987" Mar 08 06:26:03 crc kubenswrapper[4717]: I0308 06:26:03.371301 4717 generic.go:334] "Generic (PLEG): container finished" podID="a2aa1a72-2ad2-4384-bd9d-a078e04c03ca" containerID="387a5454d70eb7d06feb115829ee34d0fa220cee3d8bc4b4ee1c011f4101bd1a" exitCode=0 Mar 08 06:26:03 crc 
kubenswrapper[4717]: I0308 06:26:03.371375 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549186-7rn67" event={"ID":"a2aa1a72-2ad2-4384-bd9d-a078e04c03ca","Type":"ContainerDied","Data":"387a5454d70eb7d06feb115829ee34d0fa220cee3d8bc4b4ee1c011f4101bd1a"} Mar 08 06:26:04 crc kubenswrapper[4717]: I0308 06:26:04.797350 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549186-7rn67" Mar 08 06:26:04 crc kubenswrapper[4717]: I0308 06:26:04.850213 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbx4p\" (UniqueName: \"kubernetes.io/projected/a2aa1a72-2ad2-4384-bd9d-a078e04c03ca-kube-api-access-nbx4p\") pod \"a2aa1a72-2ad2-4384-bd9d-a078e04c03ca\" (UID: \"a2aa1a72-2ad2-4384-bd9d-a078e04c03ca\") " Mar 08 06:26:04 crc kubenswrapper[4717]: I0308 06:26:04.874117 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2aa1a72-2ad2-4384-bd9d-a078e04c03ca-kube-api-access-nbx4p" (OuterVolumeSpecName: "kube-api-access-nbx4p") pod "a2aa1a72-2ad2-4384-bd9d-a078e04c03ca" (UID: "a2aa1a72-2ad2-4384-bd9d-a078e04c03ca"). InnerVolumeSpecName "kube-api-access-nbx4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:26:04 crc kubenswrapper[4717]: I0308 06:26:04.951397 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbx4p\" (UniqueName: \"kubernetes.io/projected/a2aa1a72-2ad2-4384-bd9d-a078e04c03ca-kube-api-access-nbx4p\") on node \"crc\" DevicePath \"\"" Mar 08 06:26:05 crc kubenswrapper[4717]: I0308 06:26:05.406249 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549186-7rn67" event={"ID":"a2aa1a72-2ad2-4384-bd9d-a078e04c03ca","Type":"ContainerDied","Data":"02b99e6593e961fff5eb6cfc63d0244384cc8ac65cb67f944b565f6ca56351d4"} Mar 08 06:26:05 crc kubenswrapper[4717]: I0308 06:26:05.406768 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02b99e6593e961fff5eb6cfc63d0244384cc8ac65cb67f944b565f6ca56351d4" Mar 08 06:26:05 crc kubenswrapper[4717]: I0308 06:26:05.406382 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549186-7rn67" Mar 08 06:26:05 crc kubenswrapper[4717]: I0308 06:26:05.523614 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549180-mnjdx"] Mar 08 06:26:05 crc kubenswrapper[4717]: I0308 06:26:05.541035 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549180-mnjdx"] Mar 08 06:26:05 crc kubenswrapper[4717]: I0308 06:26:05.805397 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a3959ef-9a09-4399-9a35-299eabd577f4" path="/var/lib/kubelet/pods/5a3959ef-9a09-4399-9a35-299eabd577f4/volumes" Mar 08 06:26:34 crc kubenswrapper[4717]: I0308 06:26:34.120198 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 08 06:26:34 crc kubenswrapper[4717]: I0308 06:26:34.120856 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:26:37 crc kubenswrapper[4717]: I0308 06:26:37.842141 4717 scope.go:117] "RemoveContainer" containerID="2ac33ed076a45fa5b5a5d57c6c489740182513a02e91f53e04cfcd3c35bd879f" Mar 08 06:27:04 crc kubenswrapper[4717]: I0308 06:27:04.119848 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:27:04 crc kubenswrapper[4717]: I0308 06:27:04.120673 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.796837 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g2rkq"] Mar 08 06:27:07 crc kubenswrapper[4717]: E0308 06:27:07.798374 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2aa1a72-2ad2-4384-bd9d-a078e04c03ca" containerName="oc" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.798411 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2aa1a72-2ad2-4384-bd9d-a078e04c03ca" containerName="oc" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.799002 4717 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a2aa1a72-2ad2-4384-bd9d-a078e04c03ca" containerName="oc" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.802600 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.808458 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g2rkq"] Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.879864 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q62tq\" (UniqueName: \"kubernetes.io/projected/44cbc2a5-75d5-431c-a686-8a7e1b097f87-kube-api-access-q62tq\") pod \"certified-operators-g2rkq\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.880156 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-utilities\") pod \"certified-operators-g2rkq\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.880295 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-catalog-content\") pod \"certified-operators-g2rkq\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.984972 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q62tq\" (UniqueName: \"kubernetes.io/projected/44cbc2a5-75d5-431c-a686-8a7e1b097f87-kube-api-access-q62tq\") pod 
\"certified-operators-g2rkq\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.985066 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-utilities\") pod \"certified-operators-g2rkq\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.985244 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-catalog-content\") pod \"certified-operators-g2rkq\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.985746 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-utilities\") pod \"certified-operators-g2rkq\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:07 crc kubenswrapper[4717]: I0308 06:27:07.985903 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-catalog-content\") pod \"certified-operators-g2rkq\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:08 crc kubenswrapper[4717]: I0308 06:27:08.004742 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q62tq\" (UniqueName: \"kubernetes.io/projected/44cbc2a5-75d5-431c-a686-8a7e1b097f87-kube-api-access-q62tq\") pod \"certified-operators-g2rkq\" (UID: 
\"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:08 crc kubenswrapper[4717]: I0308 06:27:08.129295 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:08 crc kubenswrapper[4717]: I0308 06:27:08.713222 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g2rkq"] Mar 08 06:27:09 crc kubenswrapper[4717]: I0308 06:27:09.097236 4717 generic.go:334] "Generic (PLEG): container finished" podID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerID="a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e" exitCode=0 Mar 08 06:27:09 crc kubenswrapper[4717]: I0308 06:27:09.097290 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2rkq" event={"ID":"44cbc2a5-75d5-431c-a686-8a7e1b097f87","Type":"ContainerDied","Data":"a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e"} Mar 08 06:27:09 crc kubenswrapper[4717]: I0308 06:27:09.098199 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2rkq" event={"ID":"44cbc2a5-75d5-431c-a686-8a7e1b097f87","Type":"ContainerStarted","Data":"80adc5e18256017a2bd2cbac43ad463a358a6bb251198acc0d256c04793ae37d"} Mar 08 06:27:09 crc kubenswrapper[4717]: I0308 06:27:09.099516 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 06:27:10 crc kubenswrapper[4717]: I0308 06:27:10.116190 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2rkq" event={"ID":"44cbc2a5-75d5-431c-a686-8a7e1b097f87","Type":"ContainerStarted","Data":"23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806"} Mar 08 06:27:11 crc kubenswrapper[4717]: E0308 06:27:11.736405 4717 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44cbc2a5_75d5_431c_a686_8a7e1b097f87.slice/crio-23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806.scope\": RecentStats: unable to find data in memory cache]" Mar 08 06:27:12 crc kubenswrapper[4717]: I0308 06:27:12.144152 4717 generic.go:334] "Generic (PLEG): container finished" podID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerID="23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806" exitCode=0 Mar 08 06:27:12 crc kubenswrapper[4717]: I0308 06:27:12.144673 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2rkq" event={"ID":"44cbc2a5-75d5-431c-a686-8a7e1b097f87","Type":"ContainerDied","Data":"23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806"} Mar 08 06:27:13 crc kubenswrapper[4717]: I0308 06:27:13.163822 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2rkq" event={"ID":"44cbc2a5-75d5-431c-a686-8a7e1b097f87","Type":"ContainerStarted","Data":"34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25"} Mar 08 06:27:13 crc kubenswrapper[4717]: I0308 06:27:13.192108 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g2rkq" podStartSLOduration=2.763313776 podStartE2EDuration="6.192089028s" podCreationTimestamp="2026-03-08 06:27:07 +0000 UTC" firstStartedPulling="2026-03-08 06:27:09.099171414 +0000 UTC m=+3656.016820258" lastFinishedPulling="2026-03-08 06:27:12.527946626 +0000 UTC m=+3659.445595510" observedRunningTime="2026-03-08 06:27:13.185811694 +0000 UTC m=+3660.103460548" watchObservedRunningTime="2026-03-08 06:27:13.192089028 +0000 UTC m=+3660.109737882" Mar 08 06:27:18 crc kubenswrapper[4717]: I0308 06:27:18.130714 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:18 crc kubenswrapper[4717]: I0308 06:27:18.131311 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:18 crc kubenswrapper[4717]: I0308 06:27:18.204710 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:18 crc kubenswrapper[4717]: I0308 06:27:18.310751 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:18 crc kubenswrapper[4717]: I0308 06:27:18.458228 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g2rkq"] Mar 08 06:27:20 crc kubenswrapper[4717]: I0308 06:27:20.252480 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g2rkq" podUID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerName="registry-server" containerID="cri-o://34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25" gracePeriod=2 Mar 08 06:27:20 crc kubenswrapper[4717]: I0308 06:27:20.825400 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:20 crc kubenswrapper[4717]: I0308 06:27:20.916591 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-catalog-content\") pod \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " Mar 08 06:27:20 crc kubenswrapper[4717]: I0308 06:27:20.916659 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-utilities\") pod \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " Mar 08 06:27:20 crc kubenswrapper[4717]: I0308 06:27:20.916763 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q62tq\" (UniqueName: \"kubernetes.io/projected/44cbc2a5-75d5-431c-a686-8a7e1b097f87-kube-api-access-q62tq\") pod \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\" (UID: \"44cbc2a5-75d5-431c-a686-8a7e1b097f87\") " Mar 08 06:27:20 crc kubenswrapper[4717]: I0308 06:27:20.917600 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-utilities" (OuterVolumeSpecName: "utilities") pod "44cbc2a5-75d5-431c-a686-8a7e1b097f87" (UID: "44cbc2a5-75d5-431c-a686-8a7e1b097f87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:27:20 crc kubenswrapper[4717]: I0308 06:27:20.924735 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44cbc2a5-75d5-431c-a686-8a7e1b097f87-kube-api-access-q62tq" (OuterVolumeSpecName: "kube-api-access-q62tq") pod "44cbc2a5-75d5-431c-a686-8a7e1b097f87" (UID: "44cbc2a5-75d5-431c-a686-8a7e1b097f87"). InnerVolumeSpecName "kube-api-access-q62tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.001974 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44cbc2a5-75d5-431c-a686-8a7e1b097f87" (UID: "44cbc2a5-75d5-431c-a686-8a7e1b097f87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.019300 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.019328 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q62tq\" (UniqueName: \"kubernetes.io/projected/44cbc2a5-75d5-431c-a686-8a7e1b097f87-kube-api-access-q62tq\") on node \"crc\" DevicePath \"\"" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.019338 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44cbc2a5-75d5-431c-a686-8a7e1b097f87-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.274147 4717 generic.go:334] "Generic (PLEG): container finished" podID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerID="34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25" exitCode=0 Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.274220 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2rkq" event={"ID":"44cbc2a5-75d5-431c-a686-8a7e1b097f87","Type":"ContainerDied","Data":"34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25"} Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.276288 4717 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-g2rkq" event={"ID":"44cbc2a5-75d5-431c-a686-8a7e1b097f87","Type":"ContainerDied","Data":"80adc5e18256017a2bd2cbac43ad463a358a6bb251198acc0d256c04793ae37d"} Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.274279 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g2rkq" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.276443 4717 scope.go:117] "RemoveContainer" containerID="34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.325636 4717 scope.go:117] "RemoveContainer" containerID="23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.331947 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g2rkq"] Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.343719 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g2rkq"] Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.369390 4717 scope.go:117] "RemoveContainer" containerID="a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.428734 4717 scope.go:117] "RemoveContainer" containerID="34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25" Mar 08 06:27:21 crc kubenswrapper[4717]: E0308 06:27:21.429444 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25\": container with ID starting with 34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25 not found: ID does not exist" containerID="34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 
06:27:21.429606 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25"} err="failed to get container status \"34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25\": rpc error: code = NotFound desc = could not find container \"34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25\": container with ID starting with 34f5579627013723bafe669c2817466a334d43f5c4b823b789271b18e3e55e25 not found: ID does not exist" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.429740 4717 scope.go:117] "RemoveContainer" containerID="23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806" Mar 08 06:27:21 crc kubenswrapper[4717]: E0308 06:27:21.430876 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806\": container with ID starting with 23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806 not found: ID does not exist" containerID="23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.430938 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806"} err="failed to get container status \"23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806\": rpc error: code = NotFound desc = could not find container \"23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806\": container with ID starting with 23b3ae52d0621154e06b862f31e8b2ab81f4b94655538e2daea26c9e346ea806 not found: ID does not exist" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.430979 4717 scope.go:117] "RemoveContainer" containerID="a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e" Mar 08 06:27:21 crc 
kubenswrapper[4717]: E0308 06:27:21.431474 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e\": container with ID starting with a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e not found: ID does not exist" containerID="a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.431517 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e"} err="failed to get container status \"a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e\": rpc error: code = NotFound desc = could not find container \"a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e\": container with ID starting with a42a1e35435a081c412d11d381df12730019c4639916d5a9abc8fbf0f5b1385e not found: ID does not exist" Mar 08 06:27:21 crc kubenswrapper[4717]: I0308 06:27:21.804601 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" path="/var/lib/kubelet/pods/44cbc2a5-75d5-431c-a686-8a7e1b097f87/volumes" Mar 08 06:27:34 crc kubenswrapper[4717]: I0308 06:27:34.120182 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:27:34 crc kubenswrapper[4717]: I0308 06:27:34.120853 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 08 06:27:34 crc kubenswrapper[4717]: I0308 06:27:34.120913 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 06:27:34 crc kubenswrapper[4717]: I0308 06:27:34.121517 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 06:27:34 crc kubenswrapper[4717]: I0308 06:27:34.121585 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" gracePeriod=600 Mar 08 06:27:34 crc kubenswrapper[4717]: E0308 06:27:34.252622 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:27:34 crc kubenswrapper[4717]: I0308 06:27:34.452515 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" exitCode=0 Mar 08 06:27:34 crc kubenswrapper[4717]: I0308 06:27:34.452585 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1"} Mar 08 06:27:34 crc kubenswrapper[4717]: I0308 06:27:34.452658 4717 scope.go:117] "RemoveContainer" containerID="d05f173ba54b5f927464ed4171379c1843a6f24f88a932bee340d1c2aecf97fc" Mar 08 06:27:34 crc kubenswrapper[4717]: I0308 06:27:34.453996 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:27:34 crc kubenswrapper[4717]: E0308 06:27:34.454635 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:27:49 crc kubenswrapper[4717]: I0308 06:27:49.782408 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:27:49 crc kubenswrapper[4717]: E0308 06:27:49.783651 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.152512 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549188-4pv79"] Mar 08 06:28:00 crc kubenswrapper[4717]: E0308 06:28:00.153788 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerName="extract-content" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.153812 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerName="extract-content" Mar 08 06:28:00 crc kubenswrapper[4717]: E0308 06:28:00.153858 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerName="extract-utilities" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.153872 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerName="extract-utilities" Mar 08 06:28:00 crc kubenswrapper[4717]: E0308 06:28:00.153894 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerName="registry-server" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.153907 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerName="registry-server" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.154368 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="44cbc2a5-75d5-431c-a686-8a7e1b097f87" containerName="registry-server" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.155710 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549188-4pv79" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.159271 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.159316 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.159939 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.163861 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549188-4pv79"] Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.220294 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdwgg\" (UniqueName: \"kubernetes.io/projected/4ad59f4c-b152-4455-a5da-055f2bb76b40-kube-api-access-vdwgg\") pod \"auto-csr-approver-29549188-4pv79\" (UID: \"4ad59f4c-b152-4455-a5da-055f2bb76b40\") " pod="openshift-infra/auto-csr-approver-29549188-4pv79" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.323655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdwgg\" (UniqueName: \"kubernetes.io/projected/4ad59f4c-b152-4455-a5da-055f2bb76b40-kube-api-access-vdwgg\") pod \"auto-csr-approver-29549188-4pv79\" (UID: \"4ad59f4c-b152-4455-a5da-055f2bb76b40\") " pod="openshift-infra/auto-csr-approver-29549188-4pv79" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.358782 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdwgg\" (UniqueName: \"kubernetes.io/projected/4ad59f4c-b152-4455-a5da-055f2bb76b40-kube-api-access-vdwgg\") pod \"auto-csr-approver-29549188-4pv79\" (UID: \"4ad59f4c-b152-4455-a5da-055f2bb76b40\") " 
pod="openshift-infra/auto-csr-approver-29549188-4pv79" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.484202 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549188-4pv79" Mar 08 06:28:00 crc kubenswrapper[4717]: I0308 06:28:00.979630 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549188-4pv79"] Mar 08 06:28:00 crc kubenswrapper[4717]: W0308 06:28:00.995498 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ad59f4c_b152_4455_a5da_055f2bb76b40.slice/crio-81d4bc67dd2bfefa48785f682337da788b44bdb95b16db5a633abfb372626c3a WatchSource:0}: Error finding container 81d4bc67dd2bfefa48785f682337da788b44bdb95b16db5a633abfb372626c3a: Status 404 returned error can't find the container with id 81d4bc67dd2bfefa48785f682337da788b44bdb95b16db5a633abfb372626c3a Mar 08 06:28:01 crc kubenswrapper[4717]: I0308 06:28:01.782600 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:28:01 crc kubenswrapper[4717]: E0308 06:28:01.783202 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:28:01 crc kubenswrapper[4717]: I0308 06:28:01.803922 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549188-4pv79" event={"ID":"4ad59f4c-b152-4455-a5da-055f2bb76b40","Type":"ContainerStarted","Data":"81d4bc67dd2bfefa48785f682337da788b44bdb95b16db5a633abfb372626c3a"} Mar 08 06:28:02 crc 
kubenswrapper[4717]: I0308 06:28:02.805670 4717 generic.go:334] "Generic (PLEG): container finished" podID="4ad59f4c-b152-4455-a5da-055f2bb76b40" containerID="333402b52ff8e4f00291caf2d9639aef7c526f292a14c5872d63e9bc89b2989e" exitCode=0 Mar 08 06:28:02 crc kubenswrapper[4717]: I0308 06:28:02.805869 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549188-4pv79" event={"ID":"4ad59f4c-b152-4455-a5da-055f2bb76b40","Type":"ContainerDied","Data":"333402b52ff8e4f00291caf2d9639aef7c526f292a14c5872d63e9bc89b2989e"} Mar 08 06:28:04 crc kubenswrapper[4717]: I0308 06:28:04.316724 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549188-4pv79" Mar 08 06:28:04 crc kubenswrapper[4717]: I0308 06:28:04.418439 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdwgg\" (UniqueName: \"kubernetes.io/projected/4ad59f4c-b152-4455-a5da-055f2bb76b40-kube-api-access-vdwgg\") pod \"4ad59f4c-b152-4455-a5da-055f2bb76b40\" (UID: \"4ad59f4c-b152-4455-a5da-055f2bb76b40\") " Mar 08 06:28:04 crc kubenswrapper[4717]: I0308 06:28:04.429183 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad59f4c-b152-4455-a5da-055f2bb76b40-kube-api-access-vdwgg" (OuterVolumeSpecName: "kube-api-access-vdwgg") pod "4ad59f4c-b152-4455-a5da-055f2bb76b40" (UID: "4ad59f4c-b152-4455-a5da-055f2bb76b40"). InnerVolumeSpecName "kube-api-access-vdwgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:28:04 crc kubenswrapper[4717]: I0308 06:28:04.521186 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdwgg\" (UniqueName: \"kubernetes.io/projected/4ad59f4c-b152-4455-a5da-055f2bb76b40-kube-api-access-vdwgg\") on node \"crc\" DevicePath \"\"" Mar 08 06:28:04 crc kubenswrapper[4717]: I0308 06:28:04.836076 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549188-4pv79" event={"ID":"4ad59f4c-b152-4455-a5da-055f2bb76b40","Type":"ContainerDied","Data":"81d4bc67dd2bfefa48785f682337da788b44bdb95b16db5a633abfb372626c3a"} Mar 08 06:28:04 crc kubenswrapper[4717]: I0308 06:28:04.836126 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81d4bc67dd2bfefa48785f682337da788b44bdb95b16db5a633abfb372626c3a" Mar 08 06:28:04 crc kubenswrapper[4717]: I0308 06:28:04.836171 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549188-4pv79" Mar 08 06:28:05 crc kubenswrapper[4717]: I0308 06:28:05.418986 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549182-srx98"] Mar 08 06:28:05 crc kubenswrapper[4717]: I0308 06:28:05.437514 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549182-srx98"] Mar 08 06:28:05 crc kubenswrapper[4717]: I0308 06:28:05.805879 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2839767-7bc6-4fd5-8254-d68ad53c4a6e" path="/var/lib/kubelet/pods/d2839767-7bc6-4fd5-8254-d68ad53c4a6e/volumes" Mar 08 06:28:16 crc kubenswrapper[4717]: I0308 06:28:16.781616 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:28:16 crc kubenswrapper[4717]: E0308 06:28:16.782576 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:28:27 crc kubenswrapper[4717]: I0308 06:28:27.786651 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:28:27 crc kubenswrapper[4717]: E0308 06:28:27.787637 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:28:38 crc kubenswrapper[4717]: I0308 06:28:38.045956 4717 scope.go:117] "RemoveContainer" containerID="3daccea532945de186a6c832ba1e4d0bd2ae02f5cbde6f8e881dd787950464d9" Mar 08 06:28:39 crc kubenswrapper[4717]: I0308 06:28:39.782371 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:28:39 crc kubenswrapper[4717]: E0308 06:28:39.783678 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:28:53 crc kubenswrapper[4717]: I0308 06:28:53.797253 4717 scope.go:117] "RemoveContainer" 
containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:28:53 crc kubenswrapper[4717]: E0308 06:28:53.798229 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:29:06 crc kubenswrapper[4717]: I0308 06:29:06.782992 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:29:06 crc kubenswrapper[4717]: E0308 06:29:06.784097 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:29:20 crc kubenswrapper[4717]: I0308 06:29:20.782439 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:29:20 crc kubenswrapper[4717]: E0308 06:29:20.783491 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:29:23 crc kubenswrapper[4717]: I0308 06:29:23.903512 4717 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pz6fx"] Mar 08 06:29:23 crc kubenswrapper[4717]: E0308 06:29:23.904415 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad59f4c-b152-4455-a5da-055f2bb76b40" containerName="oc" Mar 08 06:29:23 crc kubenswrapper[4717]: I0308 06:29:23.904432 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad59f4c-b152-4455-a5da-055f2bb76b40" containerName="oc" Mar 08 06:29:23 crc kubenswrapper[4717]: I0308 06:29:23.904696 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad59f4c-b152-4455-a5da-055f2bb76b40" containerName="oc" Mar 08 06:29:23 crc kubenswrapper[4717]: I0308 06:29:23.906473 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:23 crc kubenswrapper[4717]: I0308 06:29:23.917802 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pz6fx"] Mar 08 06:29:23 crc kubenswrapper[4717]: I0308 06:29:23.985856 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-utilities\") pod \"redhat-operators-pz6fx\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:23 crc kubenswrapper[4717]: I0308 06:29:23.985968 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qtj2\" (UniqueName: \"kubernetes.io/projected/d12d08ee-24ed-4838-8ef5-c5efb5546c06-kube-api-access-9qtj2\") pod \"redhat-operators-pz6fx\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:23 crc kubenswrapper[4717]: I0308 06:29:23.986676 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-catalog-content\") pod \"redhat-operators-pz6fx\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:24 crc kubenswrapper[4717]: I0308 06:29:24.087986 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-catalog-content\") pod \"redhat-operators-pz6fx\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:24 crc kubenswrapper[4717]: I0308 06:29:24.088104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-utilities\") pod \"redhat-operators-pz6fx\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:24 crc kubenswrapper[4717]: I0308 06:29:24.088161 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qtj2\" (UniqueName: \"kubernetes.io/projected/d12d08ee-24ed-4838-8ef5-c5efb5546c06-kube-api-access-9qtj2\") pod \"redhat-operators-pz6fx\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:24 crc kubenswrapper[4717]: I0308 06:29:24.088815 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-catalog-content\") pod \"redhat-operators-pz6fx\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:24 crc kubenswrapper[4717]: I0308 06:29:24.089017 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-utilities\") pod \"redhat-operators-pz6fx\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:24 crc kubenswrapper[4717]: I0308 06:29:24.109936 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qtj2\" (UniqueName: \"kubernetes.io/projected/d12d08ee-24ed-4838-8ef5-c5efb5546c06-kube-api-access-9qtj2\") pod \"redhat-operators-pz6fx\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:24 crc kubenswrapper[4717]: I0308 06:29:24.235121 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:24 crc kubenswrapper[4717]: I0308 06:29:24.707600 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pz6fx"] Mar 08 06:29:24 crc kubenswrapper[4717]: W0308 06:29:24.711062 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd12d08ee_24ed_4838_8ef5_c5efb5546c06.slice/crio-9d79415d6353635b00720e0e503c34c6791cb18d38f60163bfcd55d27938d4ed WatchSource:0}: Error finding container 9d79415d6353635b00720e0e503c34c6791cb18d38f60163bfcd55d27938d4ed: Status 404 returned error can't find the container with id 9d79415d6353635b00720e0e503c34c6791cb18d38f60163bfcd55d27938d4ed Mar 08 06:29:24 crc kubenswrapper[4717]: I0308 06:29:24.856580 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz6fx" event={"ID":"d12d08ee-24ed-4838-8ef5-c5efb5546c06","Type":"ContainerStarted","Data":"9d79415d6353635b00720e0e503c34c6791cb18d38f60163bfcd55d27938d4ed"} Mar 08 06:29:25 crc kubenswrapper[4717]: I0308 06:29:25.871101 4717 generic.go:334] "Generic (PLEG): container finished" podID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" 
containerID="a39e5c8ef7e8b19fbf120874dc439725aa350421bd306503ae4254b896bbf910" exitCode=0 Mar 08 06:29:25 crc kubenswrapper[4717]: I0308 06:29:25.871211 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz6fx" event={"ID":"d12d08ee-24ed-4838-8ef5-c5efb5546c06","Type":"ContainerDied","Data":"a39e5c8ef7e8b19fbf120874dc439725aa350421bd306503ae4254b896bbf910"} Mar 08 06:29:26 crc kubenswrapper[4717]: I0308 06:29:26.886159 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz6fx" event={"ID":"d12d08ee-24ed-4838-8ef5-c5efb5546c06","Type":"ContainerStarted","Data":"3f3a5a83552354f70ba6a5a0586b622003bbf76f56b59d03421d470ffac195ec"} Mar 08 06:29:32 crc kubenswrapper[4717]: I0308 06:29:32.961620 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz6fx" event={"ID":"d12d08ee-24ed-4838-8ef5-c5efb5546c06","Type":"ContainerDied","Data":"3f3a5a83552354f70ba6a5a0586b622003bbf76f56b59d03421d470ffac195ec"} Mar 08 06:29:32 crc kubenswrapper[4717]: I0308 06:29:32.961640 4717 generic.go:334] "Generic (PLEG): container finished" podID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerID="3f3a5a83552354f70ba6a5a0586b622003bbf76f56b59d03421d470ffac195ec" exitCode=0 Mar 08 06:29:34 crc kubenswrapper[4717]: I0308 06:29:34.004034 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz6fx" event={"ID":"d12d08ee-24ed-4838-8ef5-c5efb5546c06","Type":"ContainerStarted","Data":"da33f37d6e80c2864f7c59712e5c945d61aa306e6e38d613d248f4c24ecd87e1"} Mar 08 06:29:34 crc kubenswrapper[4717]: I0308 06:29:34.235968 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:34 crc kubenswrapper[4717]: I0308 06:29:34.236023 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pz6fx" 
Mar 08 06:29:34 crc kubenswrapper[4717]: I0308 06:29:34.783120 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:29:34 crc kubenswrapper[4717]: E0308 06:29:34.783469 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:29:35 crc kubenswrapper[4717]: I0308 06:29:35.284473 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pz6fx" podUID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerName="registry-server" probeResult="failure" output=< Mar 08 06:29:35 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 06:29:35 crc kubenswrapper[4717]: > Mar 08 06:29:45 crc kubenswrapper[4717]: I0308 06:29:45.146756 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:45 crc kubenswrapper[4717]: I0308 06:29:45.165428 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pz6fx" podStartSLOduration=14.6542241 podStartE2EDuration="22.165410056s" podCreationTimestamp="2026-03-08 06:29:23 +0000 UTC" firstStartedPulling="2026-03-08 06:29:25.874901451 +0000 UTC m=+3792.792550325" lastFinishedPulling="2026-03-08 06:29:33.386087427 +0000 UTC m=+3800.303736281" observedRunningTime="2026-03-08 06:29:34.033015047 +0000 UTC m=+3800.950663891" watchObservedRunningTime="2026-03-08 06:29:45.165410056 +0000 UTC m=+3812.083058910" Mar 08 06:29:45 crc kubenswrapper[4717]: I0308 06:29:45.211433 4717 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:45 crc kubenswrapper[4717]: I0308 06:29:45.380201 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pz6fx"] Mar 08 06:29:47 crc kubenswrapper[4717]: I0308 06:29:47.146010 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pz6fx" podUID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerName="registry-server" containerID="cri-o://da33f37d6e80c2864f7c59712e5c945d61aa306e6e38d613d248f4c24ecd87e1" gracePeriod=2 Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.161846 4717 generic.go:334] "Generic (PLEG): container finished" podID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerID="da33f37d6e80c2864f7c59712e5c945d61aa306e6e38d613d248f4c24ecd87e1" exitCode=0 Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.161930 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz6fx" event={"ID":"d12d08ee-24ed-4838-8ef5-c5efb5546c06","Type":"ContainerDied","Data":"da33f37d6e80c2864f7c59712e5c945d61aa306e6e38d613d248f4c24ecd87e1"} Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.162207 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz6fx" event={"ID":"d12d08ee-24ed-4838-8ef5-c5efb5546c06","Type":"ContainerDied","Data":"9d79415d6353635b00720e0e503c34c6791cb18d38f60163bfcd55d27938d4ed"} Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.162223 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d79415d6353635b00720e0e503c34c6791cb18d38f60163bfcd55d27938d4ed" Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.223322 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.237362 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-utilities\") pod \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.237632 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qtj2\" (UniqueName: \"kubernetes.io/projected/d12d08ee-24ed-4838-8ef5-c5efb5546c06-kube-api-access-9qtj2\") pod \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.237780 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-catalog-content\") pod \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\" (UID: \"d12d08ee-24ed-4838-8ef5-c5efb5546c06\") " Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.238464 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-utilities" (OuterVolumeSpecName: "utilities") pod "d12d08ee-24ed-4838-8ef5-c5efb5546c06" (UID: "d12d08ee-24ed-4838-8ef5-c5efb5546c06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.250730 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d12d08ee-24ed-4838-8ef5-c5efb5546c06-kube-api-access-9qtj2" (OuterVolumeSpecName: "kube-api-access-9qtj2") pod "d12d08ee-24ed-4838-8ef5-c5efb5546c06" (UID: "d12d08ee-24ed-4838-8ef5-c5efb5546c06"). InnerVolumeSpecName "kube-api-access-9qtj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.343490 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qtj2\" (UniqueName: \"kubernetes.io/projected/d12d08ee-24ed-4838-8ef5-c5efb5546c06-kube-api-access-9qtj2\") on node \"crc\" DevicePath \"\"" Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.343790 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.362125 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d12d08ee-24ed-4838-8ef5-c5efb5546c06" (UID: "d12d08ee-24ed-4838-8ef5-c5efb5546c06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:29:48 crc kubenswrapper[4717]: I0308 06:29:48.445522 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d12d08ee-24ed-4838-8ef5-c5efb5546c06-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:29:49 crc kubenswrapper[4717]: I0308 06:29:49.174644 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pz6fx" Mar 08 06:29:49 crc kubenswrapper[4717]: I0308 06:29:49.232160 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pz6fx"] Mar 08 06:29:49 crc kubenswrapper[4717]: I0308 06:29:49.250543 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pz6fx"] Mar 08 06:29:49 crc kubenswrapper[4717]: I0308 06:29:49.782509 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:29:49 crc kubenswrapper[4717]: E0308 06:29:49.783083 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:29:49 crc kubenswrapper[4717]: I0308 06:29:49.808903 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" path="/var/lib/kubelet/pods/d12d08ee-24ed-4838-8ef5-c5efb5546c06/volumes" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.152935 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549190-cgwh4"] Mar 08 06:30:00 crc kubenswrapper[4717]: E0308 06:30:00.154028 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerName="registry-server" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.154044 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerName="registry-server" Mar 08 06:30:00 crc kubenswrapper[4717]: E0308 06:30:00.154073 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerName="extract-content" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.154081 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerName="extract-content" Mar 08 06:30:00 crc kubenswrapper[4717]: E0308 06:30:00.154118 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerName="extract-utilities" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.154129 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerName="extract-utilities" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.154405 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12d08ee-24ed-4838-8ef5-c5efb5546c06" containerName="registry-server" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.155240 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549190-cgwh4" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.157102 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.157105 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.158028 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.168428 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg"] Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.170367 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.171956 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.172257 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.195110 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549190-cgwh4"] Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.203629 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg"] Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.307524 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d25fd984-997f-4aa9-8d01-a6acd96b1841-config-volume\") pod \"collect-profiles-29549190-rg7zg\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.307592 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtvss\" (UniqueName: \"kubernetes.io/projected/25a0d469-5b49-463a-9f68-b7dba1c00091-kube-api-access-rtvss\") pod \"auto-csr-approver-29549190-cgwh4\" (UID: \"25a0d469-5b49-463a-9f68-b7dba1c00091\") " pod="openshift-infra/auto-csr-approver-29549190-cgwh4" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.307649 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmdn8\" (UniqueName: 
\"kubernetes.io/projected/d25fd984-997f-4aa9-8d01-a6acd96b1841-kube-api-access-pmdn8\") pod \"collect-profiles-29549190-rg7zg\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.307780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d25fd984-997f-4aa9-8d01-a6acd96b1841-secret-volume\") pod \"collect-profiles-29549190-rg7zg\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.410023 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d25fd984-997f-4aa9-8d01-a6acd96b1841-config-volume\") pod \"collect-profiles-29549190-rg7zg\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.410098 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtvss\" (UniqueName: \"kubernetes.io/projected/25a0d469-5b49-463a-9f68-b7dba1c00091-kube-api-access-rtvss\") pod \"auto-csr-approver-29549190-cgwh4\" (UID: \"25a0d469-5b49-463a-9f68-b7dba1c00091\") " pod="openshift-infra/auto-csr-approver-29549190-cgwh4" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.410203 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmdn8\" (UniqueName: \"kubernetes.io/projected/d25fd984-997f-4aa9-8d01-a6acd96b1841-kube-api-access-pmdn8\") pod \"collect-profiles-29549190-rg7zg\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc 
kubenswrapper[4717]: I0308 06:30:00.410426 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d25fd984-997f-4aa9-8d01-a6acd96b1841-secret-volume\") pod \"collect-profiles-29549190-rg7zg\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.410983 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d25fd984-997f-4aa9-8d01-a6acd96b1841-config-volume\") pod \"collect-profiles-29549190-rg7zg\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.426242 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d25fd984-997f-4aa9-8d01-a6acd96b1841-secret-volume\") pod \"collect-profiles-29549190-rg7zg\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.432037 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmdn8\" (UniqueName: \"kubernetes.io/projected/d25fd984-997f-4aa9-8d01-a6acd96b1841-kube-api-access-pmdn8\") pod \"collect-profiles-29549190-rg7zg\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.432635 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtvss\" (UniqueName: \"kubernetes.io/projected/25a0d469-5b49-463a-9f68-b7dba1c00091-kube-api-access-rtvss\") pod \"auto-csr-approver-29549190-cgwh4\" (UID: \"25a0d469-5b49-463a-9f68-b7dba1c00091\") " 
pod="openshift-infra/auto-csr-approver-29549190-cgwh4" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.472057 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549190-cgwh4" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.486380 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.782544 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:30:00 crc kubenswrapper[4717]: E0308 06:30:00.783081 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:30:00 crc kubenswrapper[4717]: I0308 06:30:00.991079 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549190-cgwh4"] Mar 08 06:30:01 crc kubenswrapper[4717]: I0308 06:30:01.005948 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg"] Mar 08 06:30:01 crc kubenswrapper[4717]: I0308 06:30:01.301980 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549190-cgwh4" event={"ID":"25a0d469-5b49-463a-9f68-b7dba1c00091","Type":"ContainerStarted","Data":"5d560275e750e9193a3a1fbe825a389f45d0e69261bc0946d60e83332bfa9c76"} Mar 08 06:30:01 crc kubenswrapper[4717]: I0308 06:30:01.304962 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" event={"ID":"d25fd984-997f-4aa9-8d01-a6acd96b1841","Type":"ContainerStarted","Data":"c1577b0f49da40debfe5e950da589752093e423f4e29f292c1f1e867d2aa0aae"} Mar 08 06:30:01 crc kubenswrapper[4717]: I0308 06:30:01.306115 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" event={"ID":"d25fd984-997f-4aa9-8d01-a6acd96b1841","Type":"ContainerStarted","Data":"c8e58dd8a423ce59b6784a5ae3fcada248af2cff50c34fec0bd7ef1562901d13"} Mar 08 06:30:01 crc kubenswrapper[4717]: I0308 06:30:01.350990 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" podStartSLOduration=1.350966667 podStartE2EDuration="1.350966667s" podCreationTimestamp="2026-03-08 06:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 06:30:01.345453031 +0000 UTC m=+3828.263101915" watchObservedRunningTime="2026-03-08 06:30:01.350966667 +0000 UTC m=+3828.268615531" Mar 08 06:30:02 crc kubenswrapper[4717]: I0308 06:30:02.319807 4717 generic.go:334] "Generic (PLEG): container finished" podID="d25fd984-997f-4aa9-8d01-a6acd96b1841" containerID="c1577b0f49da40debfe5e950da589752093e423f4e29f292c1f1e867d2aa0aae" exitCode=0 Mar 08 06:30:02 crc kubenswrapper[4717]: I0308 06:30:02.319928 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" event={"ID":"d25fd984-997f-4aa9-8d01-a6acd96b1841","Type":"ContainerDied","Data":"c1577b0f49da40debfe5e950da589752093e423f4e29f292c1f1e867d2aa0aae"} Mar 08 06:30:03 crc kubenswrapper[4717]: I0308 06:30:03.343517 4717 generic.go:334] "Generic (PLEG): container finished" podID="25a0d469-5b49-463a-9f68-b7dba1c00091" 
containerID="2b970b93d4eecb4a59e160db147234496e52c60c2b59b66fd7b24150721a5865" exitCode=0 Mar 08 06:30:03 crc kubenswrapper[4717]: I0308 06:30:03.343727 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549190-cgwh4" event={"ID":"25a0d469-5b49-463a-9f68-b7dba1c00091","Type":"ContainerDied","Data":"2b970b93d4eecb4a59e160db147234496e52c60c2b59b66fd7b24150721a5865"} Mar 08 06:30:03 crc kubenswrapper[4717]: I0308 06:30:03.806305 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:03 crc kubenswrapper[4717]: I0308 06:30:03.914963 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmdn8\" (UniqueName: \"kubernetes.io/projected/d25fd984-997f-4aa9-8d01-a6acd96b1841-kube-api-access-pmdn8\") pod \"d25fd984-997f-4aa9-8d01-a6acd96b1841\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " Mar 08 06:30:03 crc kubenswrapper[4717]: I0308 06:30:03.915057 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d25fd984-997f-4aa9-8d01-a6acd96b1841-secret-volume\") pod \"d25fd984-997f-4aa9-8d01-a6acd96b1841\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " Mar 08 06:30:03 crc kubenswrapper[4717]: I0308 06:30:03.915080 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d25fd984-997f-4aa9-8d01-a6acd96b1841-config-volume\") pod \"d25fd984-997f-4aa9-8d01-a6acd96b1841\" (UID: \"d25fd984-997f-4aa9-8d01-a6acd96b1841\") " Mar 08 06:30:03 crc kubenswrapper[4717]: I0308 06:30:03.916427 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d25fd984-997f-4aa9-8d01-a6acd96b1841-config-volume" (OuterVolumeSpecName: "config-volume") pod "d25fd984-997f-4aa9-8d01-a6acd96b1841" 
(UID: "d25fd984-997f-4aa9-8d01-a6acd96b1841"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:30:03 crc kubenswrapper[4717]: I0308 06:30:03.924653 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d25fd984-997f-4aa9-8d01-a6acd96b1841-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d25fd984-997f-4aa9-8d01-a6acd96b1841" (UID: "d25fd984-997f-4aa9-8d01-a6acd96b1841"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:30:03 crc kubenswrapper[4717]: I0308 06:30:03.936581 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d25fd984-997f-4aa9-8d01-a6acd96b1841-kube-api-access-pmdn8" (OuterVolumeSpecName: "kube-api-access-pmdn8") pod "d25fd984-997f-4aa9-8d01-a6acd96b1841" (UID: "d25fd984-997f-4aa9-8d01-a6acd96b1841"). InnerVolumeSpecName "kube-api-access-pmdn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.017903 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmdn8\" (UniqueName: \"kubernetes.io/projected/d25fd984-997f-4aa9-8d01-a6acd96b1841-kube-api-access-pmdn8\") on node \"crc\" DevicePath \"\"" Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.017933 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d25fd984-997f-4aa9-8d01-a6acd96b1841-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.017943 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d25fd984-997f-4aa9-8d01-a6acd96b1841-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.354170 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.357887 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg" event={"ID":"d25fd984-997f-4aa9-8d01-a6acd96b1841","Type":"ContainerDied","Data":"c8e58dd8a423ce59b6784a5ae3fcada248af2cff50c34fec0bd7ef1562901d13"} Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.357938 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8e58dd8a423ce59b6784a5ae3fcada248af2cff50c34fec0bd7ef1562901d13" Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.414672 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz"] Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.422253 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549145-xlqmz"] Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.657906 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549190-cgwh4" Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.834514 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtvss\" (UniqueName: \"kubernetes.io/projected/25a0d469-5b49-463a-9f68-b7dba1c00091-kube-api-access-rtvss\") pod \"25a0d469-5b49-463a-9f68-b7dba1c00091\" (UID: \"25a0d469-5b49-463a-9f68-b7dba1c00091\") " Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.840222 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a0d469-5b49-463a-9f68-b7dba1c00091-kube-api-access-rtvss" (OuterVolumeSpecName: "kube-api-access-rtvss") pod "25a0d469-5b49-463a-9f68-b7dba1c00091" (UID: "25a0d469-5b49-463a-9f68-b7dba1c00091"). 
InnerVolumeSpecName "kube-api-access-rtvss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:30:04 crc kubenswrapper[4717]: I0308 06:30:04.937700 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtvss\" (UniqueName: \"kubernetes.io/projected/25a0d469-5b49-463a-9f68-b7dba1c00091-kube-api-access-rtvss\") on node \"crc\" DevicePath \"\"" Mar 08 06:30:05 crc kubenswrapper[4717]: I0308 06:30:05.372509 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549190-cgwh4" event={"ID":"25a0d469-5b49-463a-9f68-b7dba1c00091","Type":"ContainerDied","Data":"5d560275e750e9193a3a1fbe825a389f45d0e69261bc0946d60e83332bfa9c76"} Mar 08 06:30:05 crc kubenswrapper[4717]: I0308 06:30:05.372572 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549190-cgwh4" Mar 08 06:30:05 crc kubenswrapper[4717]: I0308 06:30:05.372578 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d560275e750e9193a3a1fbe825a389f45d0e69261bc0946d60e83332bfa9c76" Mar 08 06:30:05 crc kubenswrapper[4717]: I0308 06:30:05.734743 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549184-fbdcm"] Mar 08 06:30:05 crc kubenswrapper[4717]: I0308 06:30:05.746334 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549184-fbdcm"] Mar 08 06:30:05 crc kubenswrapper[4717]: I0308 06:30:05.818426 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1718a93f-8cc0-4b53-8864-5c1d6f56b849" path="/var/lib/kubelet/pods/1718a93f-8cc0-4b53-8864-5c1d6f56b849/volumes" Mar 08 06:30:05 crc kubenswrapper[4717]: I0308 06:30:05.822770 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d730563b-5163-4a57-bc85-85d59e25a6ed" path="/var/lib/kubelet/pods/d730563b-5163-4a57-bc85-85d59e25a6ed/volumes" Mar 08 06:30:11 crc 
kubenswrapper[4717]: I0308 06:30:11.782870 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:30:11 crc kubenswrapper[4717]: E0308 06:30:11.783856 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:30:22 crc kubenswrapper[4717]: I0308 06:30:22.782411 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:30:22 crc kubenswrapper[4717]: E0308 06:30:22.783827 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:30:37 crc kubenswrapper[4717]: I0308 06:30:37.782822 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:30:37 crc kubenswrapper[4717]: E0308 06:30:37.784145 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 
08 06:30:38 crc kubenswrapper[4717]: I0308 06:30:38.187520 4717 scope.go:117] "RemoveContainer" containerID="59d84a015c00b07037c34968e6fe0cf0d1ffc7a8b0604c9e8fd450737d454719" Mar 08 06:30:38 crc kubenswrapper[4717]: I0308 06:30:38.236936 4717 scope.go:117] "RemoveContainer" containerID="8bb44e2cd61646c483ba46e5c007bee41625b60eb68211d01d5e4f47c260d0f2" Mar 08 06:30:52 crc kubenswrapper[4717]: I0308 06:30:52.782578 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:30:52 crc kubenswrapper[4717]: E0308 06:30:52.784179 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:31:07 crc kubenswrapper[4717]: I0308 06:31:07.782779 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:31:07 crc kubenswrapper[4717]: E0308 06:31:07.783883 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:31:18 crc kubenswrapper[4717]: I0308 06:31:18.781831 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:31:18 crc kubenswrapper[4717]: E0308 06:31:18.782968 4717 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:31:29 crc kubenswrapper[4717]: I0308 06:31:29.782574 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:31:29 crc kubenswrapper[4717]: E0308 06:31:29.783637 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:31:41 crc kubenswrapper[4717]: I0308 06:31:41.782521 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:31:41 crc kubenswrapper[4717]: E0308 06:31:41.783748 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:31:44 crc kubenswrapper[4717]: I0308 06:31:44.958221 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q99vh"] Mar 08 06:31:44 crc kubenswrapper[4717]: E0308 06:31:44.960078 4717 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="25a0d469-5b49-463a-9f68-b7dba1c00091" containerName="oc" Mar 08 06:31:44 crc kubenswrapper[4717]: I0308 06:31:44.960122 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a0d469-5b49-463a-9f68-b7dba1c00091" containerName="oc" Mar 08 06:31:44 crc kubenswrapper[4717]: E0308 06:31:44.960157 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d25fd984-997f-4aa9-8d01-a6acd96b1841" containerName="collect-profiles" Mar 08 06:31:44 crc kubenswrapper[4717]: I0308 06:31:44.960167 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25fd984-997f-4aa9-8d01-a6acd96b1841" containerName="collect-profiles" Mar 08 06:31:44 crc kubenswrapper[4717]: I0308 06:31:44.960990 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d25fd984-997f-4aa9-8d01-a6acd96b1841" containerName="collect-profiles" Mar 08 06:31:44 crc kubenswrapper[4717]: I0308 06:31:44.961021 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a0d469-5b49-463a-9f68-b7dba1c00091" containerName="oc" Mar 08 06:31:44 crc kubenswrapper[4717]: I0308 06:31:44.973051 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:44 crc kubenswrapper[4717]: I0308 06:31:44.974019 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q99vh"] Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.055385 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-catalog-content\") pod \"redhat-marketplace-q99vh\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.055818 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28fqh\" (UniqueName: \"kubernetes.io/projected/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-kube-api-access-28fqh\") pod \"redhat-marketplace-q99vh\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.055962 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-utilities\") pod \"redhat-marketplace-q99vh\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.158733 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-catalog-content\") pod \"redhat-marketplace-q99vh\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.158957 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-28fqh\" (UniqueName: \"kubernetes.io/projected/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-kube-api-access-28fqh\") pod \"redhat-marketplace-q99vh\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.159041 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-utilities\") pod \"redhat-marketplace-q99vh\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.159712 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-utilities\") pod \"redhat-marketplace-q99vh\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.159752 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-catalog-content\") pod \"redhat-marketplace-q99vh\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.202256 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28fqh\" (UniqueName: \"kubernetes.io/projected/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-kube-api-access-28fqh\") pod \"redhat-marketplace-q99vh\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.299503 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:45 crc kubenswrapper[4717]: I0308 06:31:45.869723 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q99vh"] Mar 08 06:31:46 crc kubenswrapper[4717]: I0308 06:31:46.526910 4717 generic.go:334] "Generic (PLEG): container finished" podID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerID="8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9" exitCode=0 Mar 08 06:31:46 crc kubenswrapper[4717]: I0308 06:31:46.526997 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q99vh" event={"ID":"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57","Type":"ContainerDied","Data":"8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9"} Mar 08 06:31:46 crc kubenswrapper[4717]: I0308 06:31:46.527329 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q99vh" event={"ID":"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57","Type":"ContainerStarted","Data":"d112c5fcdd652ce0581377081b31d0bbe5208980b71bf90b87fe433c59f6c496"} Mar 08 06:31:47 crc kubenswrapper[4717]: I0308 06:31:47.543924 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q99vh" event={"ID":"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57","Type":"ContainerStarted","Data":"97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8"} Mar 08 06:31:48 crc kubenswrapper[4717]: I0308 06:31:48.559906 4717 generic.go:334] "Generic (PLEG): container finished" podID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerID="97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8" exitCode=0 Mar 08 06:31:48 crc kubenswrapper[4717]: I0308 06:31:48.559964 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q99vh" 
event={"ID":"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57","Type":"ContainerDied","Data":"97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8"} Mar 08 06:31:49 crc kubenswrapper[4717]: I0308 06:31:49.573258 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q99vh" event={"ID":"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57","Type":"ContainerStarted","Data":"52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c"} Mar 08 06:31:49 crc kubenswrapper[4717]: I0308 06:31:49.599010 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q99vh" podStartSLOduration=3.088797209 podStartE2EDuration="5.598992853s" podCreationTimestamp="2026-03-08 06:31:44 +0000 UTC" firstStartedPulling="2026-03-08 06:31:46.530126217 +0000 UTC m=+3933.447775101" lastFinishedPulling="2026-03-08 06:31:49.040321861 +0000 UTC m=+3935.957970745" observedRunningTime="2026-03-08 06:31:49.594088322 +0000 UTC m=+3936.511737166" watchObservedRunningTime="2026-03-08 06:31:49.598992853 +0000 UTC m=+3936.516641707" Mar 08 06:31:55 crc kubenswrapper[4717]: I0308 06:31:55.299845 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:55 crc kubenswrapper[4717]: I0308 06:31:55.300585 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:55 crc kubenswrapper[4717]: I0308 06:31:55.375731 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:55 crc kubenswrapper[4717]: I0308 06:31:55.741617 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:55 crc kubenswrapper[4717]: I0308 06:31:55.803318 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-q99vh"] Mar 08 06:31:56 crc kubenswrapper[4717]: I0308 06:31:56.783307 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:31:56 crc kubenswrapper[4717]: E0308 06:31:56.783853 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:31:57 crc kubenswrapper[4717]: I0308 06:31:57.688724 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q99vh" podUID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerName="registry-server" containerID="cri-o://52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c" gracePeriod=2 Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.279265 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.373942 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28fqh\" (UniqueName: \"kubernetes.io/projected/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-kube-api-access-28fqh\") pod \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.374013 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-utilities\") pod \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.374132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-catalog-content\") pod \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\" (UID: \"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57\") " Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.374978 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-utilities" (OuterVolumeSpecName: "utilities") pod "8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" (UID: "8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.393988 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-kube-api-access-28fqh" (OuterVolumeSpecName: "kube-api-access-28fqh") pod "8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" (UID: "8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57"). InnerVolumeSpecName "kube-api-access-28fqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.412239 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" (UID: "8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.475633 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.475665 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28fqh\" (UniqueName: \"kubernetes.io/projected/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-kube-api-access-28fqh\") on node \"crc\" DevicePath \"\"" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.475677 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.707262 4717 generic.go:334] "Generic (PLEG): container finished" podID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerID="52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c" exitCode=0 Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.707346 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q99vh" event={"ID":"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57","Type":"ContainerDied","Data":"52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c"} Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.707398 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-q99vh" event={"ID":"8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57","Type":"ContainerDied","Data":"d112c5fcdd652ce0581377081b31d0bbe5208980b71bf90b87fe433c59f6c496"} Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.707833 4717 scope.go:117] "RemoveContainer" containerID="52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.707886 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q99vh" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.759338 4717 scope.go:117] "RemoveContainer" containerID="97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.771803 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q99vh"] Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.786589 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q99vh"] Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.791120 4717 scope.go:117] "RemoveContainer" containerID="8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.862586 4717 scope.go:117] "RemoveContainer" containerID="52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c" Mar 08 06:31:58 crc kubenswrapper[4717]: E0308 06:31:58.863200 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c\": container with ID starting with 52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c not found: ID does not exist" containerID="52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.863242 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c"} err="failed to get container status \"52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c\": rpc error: code = NotFound desc = could not find container \"52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c\": container with ID starting with 52322f2f5de03eea0c1824e186cd3ea7b0596d48c1d773968e93a4db2a8e993c not found: ID does not exist" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.863266 4717 scope.go:117] "RemoveContainer" containerID="97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8" Mar 08 06:31:58 crc kubenswrapper[4717]: E0308 06:31:58.863882 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8\": container with ID starting with 97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8 not found: ID does not exist" containerID="97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.863909 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8"} err="failed to get container status \"97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8\": rpc error: code = NotFound desc = could not find container \"97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8\": container with ID starting with 97b23e0c2242e7da2e33b99c84d6c0de8cfaf7ca6bee823292ea614c569fb7e8 not found: ID does not exist" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.863926 4717 scope.go:117] "RemoveContainer" containerID="8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9" Mar 08 06:31:58 crc kubenswrapper[4717]: E0308 
06:31:58.864332 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9\": container with ID starting with 8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9 not found: ID does not exist" containerID="8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9" Mar 08 06:31:58 crc kubenswrapper[4717]: I0308 06:31:58.864361 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9"} err="failed to get container status \"8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9\": rpc error: code = NotFound desc = could not find container \"8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9\": container with ID starting with 8b5ad01da7c08d4489a00ab45fe2f5bdf024eb965c052b52b46ddcbffeb33df9 not found: ID does not exist" Mar 08 06:31:59 crc kubenswrapper[4717]: I0308 06:31:59.804827 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" path="/var/lib/kubelet/pods/8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57/volumes" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.163071 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549192-shjql"] Mar 08 06:32:00 crc kubenswrapper[4717]: E0308 06:32:00.163998 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerName="extract-content" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.164024 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerName="extract-content" Mar 08 06:32:00 crc kubenswrapper[4717]: E0308 06:32:00.164055 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerName="registry-server" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.164069 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerName="registry-server" Mar 08 06:32:00 crc kubenswrapper[4717]: E0308 06:32:00.164113 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerName="extract-utilities" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.164126 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerName="extract-utilities" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.164553 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7bfaa0-eedd-4a5c-9ed5-eec7dca01d57" containerName="registry-server" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.165851 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549192-shjql" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.168817 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.168822 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.169379 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.177921 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549192-shjql"] Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.208731 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbcc8\" (UniqueName: 
\"kubernetes.io/projected/ca011fc0-86f8-4a30-b556-bb853740acab-kube-api-access-zbcc8\") pod \"auto-csr-approver-29549192-shjql\" (UID: \"ca011fc0-86f8-4a30-b556-bb853740acab\") " pod="openshift-infra/auto-csr-approver-29549192-shjql" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.311734 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbcc8\" (UniqueName: \"kubernetes.io/projected/ca011fc0-86f8-4a30-b556-bb853740acab-kube-api-access-zbcc8\") pod \"auto-csr-approver-29549192-shjql\" (UID: \"ca011fc0-86f8-4a30-b556-bb853740acab\") " pod="openshift-infra/auto-csr-approver-29549192-shjql" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.362433 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbcc8\" (UniqueName: \"kubernetes.io/projected/ca011fc0-86f8-4a30-b556-bb853740acab-kube-api-access-zbcc8\") pod \"auto-csr-approver-29549192-shjql\" (UID: \"ca011fc0-86f8-4a30-b556-bb853740acab\") " pod="openshift-infra/auto-csr-approver-29549192-shjql" Mar 08 06:32:00 crc kubenswrapper[4717]: I0308 06:32:00.508148 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549192-shjql" Mar 08 06:32:01 crc kubenswrapper[4717]: I0308 06:32:01.074080 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549192-shjql"] Mar 08 06:32:01 crc kubenswrapper[4717]: I0308 06:32:01.751114 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549192-shjql" event={"ID":"ca011fc0-86f8-4a30-b556-bb853740acab","Type":"ContainerStarted","Data":"5e4de105a51cd9c4427128fac50b463f05c734ff349e61e3c7584cb96e352d39"} Mar 08 06:32:02 crc kubenswrapper[4717]: I0308 06:32:02.764097 4717 generic.go:334] "Generic (PLEG): container finished" podID="ca011fc0-86f8-4a30-b556-bb853740acab" containerID="48899c95f83c04e32d18cb58f07477aa1651aa838367d7099b1d207740ecb2de" exitCode=0 Mar 08 06:32:02 crc kubenswrapper[4717]: I0308 06:32:02.764226 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549192-shjql" event={"ID":"ca011fc0-86f8-4a30-b556-bb853740acab","Type":"ContainerDied","Data":"48899c95f83c04e32d18cb58f07477aa1651aa838367d7099b1d207740ecb2de"} Mar 08 06:32:04 crc kubenswrapper[4717]: I0308 06:32:04.208363 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549192-shjql" Mar 08 06:32:04 crc kubenswrapper[4717]: I0308 06:32:04.216903 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbcc8\" (UniqueName: \"kubernetes.io/projected/ca011fc0-86f8-4a30-b556-bb853740acab-kube-api-access-zbcc8\") pod \"ca011fc0-86f8-4a30-b556-bb853740acab\" (UID: \"ca011fc0-86f8-4a30-b556-bb853740acab\") " Mar 08 06:32:04 crc kubenswrapper[4717]: I0308 06:32:04.223964 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca011fc0-86f8-4a30-b556-bb853740acab-kube-api-access-zbcc8" (OuterVolumeSpecName: "kube-api-access-zbcc8") pod "ca011fc0-86f8-4a30-b556-bb853740acab" (UID: "ca011fc0-86f8-4a30-b556-bb853740acab"). InnerVolumeSpecName "kube-api-access-zbcc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:32:04 crc kubenswrapper[4717]: I0308 06:32:04.318219 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbcc8\" (UniqueName: \"kubernetes.io/projected/ca011fc0-86f8-4a30-b556-bb853740acab-kube-api-access-zbcc8\") on node \"crc\" DevicePath \"\"" Mar 08 06:32:04 crc kubenswrapper[4717]: I0308 06:32:04.797499 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549192-shjql" event={"ID":"ca011fc0-86f8-4a30-b556-bb853740acab","Type":"ContainerDied","Data":"5e4de105a51cd9c4427128fac50b463f05c734ff349e61e3c7584cb96e352d39"} Mar 08 06:32:04 crc kubenswrapper[4717]: I0308 06:32:04.797544 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e4de105a51cd9c4427128fac50b463f05c734ff349e61e3c7584cb96e352d39" Mar 08 06:32:04 crc kubenswrapper[4717]: I0308 06:32:04.797590 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549192-shjql" Mar 08 06:32:05 crc kubenswrapper[4717]: I0308 06:32:05.310420 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549186-7rn67"] Mar 08 06:32:05 crc kubenswrapper[4717]: I0308 06:32:05.322646 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549186-7rn67"] Mar 08 06:32:05 crc kubenswrapper[4717]: I0308 06:32:05.804791 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2aa1a72-2ad2-4384-bd9d-a078e04c03ca" path="/var/lib/kubelet/pods/a2aa1a72-2ad2-4384-bd9d-a078e04c03ca/volumes" Mar 08 06:32:07 crc kubenswrapper[4717]: I0308 06:32:07.782498 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:32:07 crc kubenswrapper[4717]: E0308 06:32:07.783447 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:32:22 crc kubenswrapper[4717]: I0308 06:32:22.782473 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:32:22 crc kubenswrapper[4717]: E0308 06:32:22.783799 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:32:33 crc kubenswrapper[4717]: I0308 06:32:33.795925 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:32:33 crc kubenswrapper[4717]: E0308 06:32:33.797161 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:32:38 crc kubenswrapper[4717]: I0308 06:32:38.373607 4717 scope.go:117] "RemoveContainer" containerID="387a5454d70eb7d06feb115829ee34d0fa220cee3d8bc4b4ee1c011f4101bd1a" Mar 08 06:32:47 crc kubenswrapper[4717]: I0308 06:32:47.782162 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1" Mar 08 06:32:48 crc kubenswrapper[4717]: I0308 06:32:48.341626 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"08cc9c9ff5d454a3241f318529e4f53ef0a3ee0906337538951253195cd89ef9"} Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.155590 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549194-bw44x"] Mar 08 06:34:00 crc kubenswrapper[4717]: E0308 06:34:00.156895 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca011fc0-86f8-4a30-b556-bb853740acab" containerName="oc" Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.156918 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca011fc0-86f8-4a30-b556-bb853740acab" containerName="oc" Mar 08 06:34:00 crc kubenswrapper[4717]: 
I0308 06:34:00.157234 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca011fc0-86f8-4a30-b556-bb853740acab" containerName="oc" Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.158267 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549194-bw44x" Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.161167 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.161794 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.163553 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.194130 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549194-bw44x"] Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.206925 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsg7b\" (UniqueName: \"kubernetes.io/projected/c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3-kube-api-access-jsg7b\") pod \"auto-csr-approver-29549194-bw44x\" (UID: \"c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3\") " pod="openshift-infra/auto-csr-approver-29549194-bw44x" Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.308357 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsg7b\" (UniqueName: \"kubernetes.io/projected/c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3-kube-api-access-jsg7b\") pod \"auto-csr-approver-29549194-bw44x\" (UID: \"c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3\") " pod="openshift-infra/auto-csr-approver-29549194-bw44x" Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.334647 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jsg7b\" (UniqueName: \"kubernetes.io/projected/c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3-kube-api-access-jsg7b\") pod \"auto-csr-approver-29549194-bw44x\" (UID: \"c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3\") " pod="openshift-infra/auto-csr-approver-29549194-bw44x" Mar 08 06:34:00 crc kubenswrapper[4717]: I0308 06:34:00.484906 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549194-bw44x" Mar 08 06:34:01 crc kubenswrapper[4717]: I0308 06:34:01.005120 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549194-bw44x"] Mar 08 06:34:01 crc kubenswrapper[4717]: I0308 06:34:01.017621 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 06:34:01 crc kubenswrapper[4717]: I0308 06:34:01.157874 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549194-bw44x" event={"ID":"c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3","Type":"ContainerStarted","Data":"54b441141e88bf2f5ec89b6212a0775fb168ace234182b571aa8b988c0799c1c"} Mar 08 06:34:03 crc kubenswrapper[4717]: I0308 06:34:03.188466 4717 generic.go:334] "Generic (PLEG): container finished" podID="c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3" containerID="38d7f7632f4928a26cdbfbcf48e6fe99fbad6712dfd8cbf49f7c3f6d7c9840fd" exitCode=0 Mar 08 06:34:03 crc kubenswrapper[4717]: I0308 06:34:03.188539 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549194-bw44x" event={"ID":"c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3","Type":"ContainerDied","Data":"38d7f7632f4928a26cdbfbcf48e6fe99fbad6712dfd8cbf49f7c3f6d7c9840fd"} Mar 08 06:34:04 crc kubenswrapper[4717]: I0308 06:34:04.837713 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549194-bw44x" Mar 08 06:34:04 crc kubenswrapper[4717]: I0308 06:34:04.910331 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsg7b\" (UniqueName: \"kubernetes.io/projected/c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3-kube-api-access-jsg7b\") pod \"c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3\" (UID: \"c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3\") " Mar 08 06:34:04 crc kubenswrapper[4717]: I0308 06:34:04.918790 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3-kube-api-access-jsg7b" (OuterVolumeSpecName: "kube-api-access-jsg7b") pod "c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3" (UID: "c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3"). InnerVolumeSpecName "kube-api-access-jsg7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:34:05 crc kubenswrapper[4717]: I0308 06:34:05.013275 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsg7b\" (UniqueName: \"kubernetes.io/projected/c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3-kube-api-access-jsg7b\") on node \"crc\" DevicePath \"\"" Mar 08 06:34:05 crc kubenswrapper[4717]: I0308 06:34:05.214893 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549194-bw44x" event={"ID":"c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3","Type":"ContainerDied","Data":"54b441141e88bf2f5ec89b6212a0775fb168ace234182b571aa8b988c0799c1c"} Mar 08 06:34:05 crc kubenswrapper[4717]: I0308 06:34:05.215350 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b441141e88bf2f5ec89b6212a0775fb168ace234182b571aa8b988c0799c1c" Mar 08 06:34:05 crc kubenswrapper[4717]: I0308 06:34:05.214971 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549194-bw44x" Mar 08 06:34:05 crc kubenswrapper[4717]: I0308 06:34:05.942964 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549188-4pv79"] Mar 08 06:34:05 crc kubenswrapper[4717]: I0308 06:34:05.954999 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549188-4pv79"] Mar 08 06:34:07 crc kubenswrapper[4717]: I0308 06:34:07.794278 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad59f4c-b152-4455-a5da-055f2bb76b40" path="/var/lib/kubelet/pods/4ad59f4c-b152-4455-a5da-055f2bb76b40/volumes" Mar 08 06:34:38 crc kubenswrapper[4717]: I0308 06:34:38.515620 4717 scope.go:117] "RemoveContainer" containerID="333402b52ff8e4f00291caf2d9639aef7c526f292a14c5872d63e9bc89b2989e" Mar 08 06:35:04 crc kubenswrapper[4717]: I0308 06:35:04.120543 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:35:04 crc kubenswrapper[4717]: I0308 06:35:04.121092 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:35:34 crc kubenswrapper[4717]: I0308 06:35:34.119827 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:35:34 crc kubenswrapper[4717]: 
I0308 06:35:34.121127 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 06:35:38 crc kubenswrapper[4717]: I0308 06:35:38.886934 4717 scope.go:117] "RemoveContainer" containerID="3f3a5a83552354f70ba6a5a0586b622003bbf76f56b59d03421d470ffac195ec"
Mar 08 06:35:38 crc kubenswrapper[4717]: I0308 06:35:38.911422 4717 scope.go:117] "RemoveContainer" containerID="a39e5c8ef7e8b19fbf120874dc439725aa350421bd306503ae4254b896bbf910"
Mar 08 06:35:38 crc kubenswrapper[4717]: I0308 06:35:38.970459 4717 scope.go:117] "RemoveContainer" containerID="da33f37d6e80c2864f7c59712e5c945d61aa306e6e38d613d248f4c24ecd87e1"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.178156 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-78j2j"]
Mar 08 06:35:52 crc kubenswrapper[4717]: E0308 06:35:52.179138 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3" containerName="oc"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.179153 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3" containerName="oc"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.179384 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3" containerName="oc"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.181124 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.209008 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78j2j"]
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.336913 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-catalog-content\") pod \"community-operators-78j2j\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") " pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.337104 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-utilities\") pod \"community-operators-78j2j\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") " pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.337241 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j6qk\" (UniqueName: \"kubernetes.io/projected/7379e714-d0c4-42b9-9f4a-d2b246816b39-kube-api-access-5j6qk\") pod \"community-operators-78j2j\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") " pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.439222 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-catalog-content\") pod \"community-operators-78j2j\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") " pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.439368 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-utilities\") pod \"community-operators-78j2j\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") " pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.439548 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j6qk\" (UniqueName: \"kubernetes.io/projected/7379e714-d0c4-42b9-9f4a-d2b246816b39-kube-api-access-5j6qk\") pod \"community-operators-78j2j\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") " pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.439894 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-utilities\") pod \"community-operators-78j2j\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") " pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.440202 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-catalog-content\") pod \"community-operators-78j2j\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") " pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.469662 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j6qk\" (UniqueName: \"kubernetes.io/projected/7379e714-d0c4-42b9-9f4a-d2b246816b39-kube-api-access-5j6qk\") pod \"community-operators-78j2j\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") " pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:52 crc kubenswrapper[4717]: I0308 06:35:52.518350 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:35:53 crc kubenswrapper[4717]: I0308 06:35:53.084858 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78j2j"]
Mar 08 06:35:53 crc kubenswrapper[4717]: I0308 06:35:53.517092 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2j" event={"ID":"7379e714-d0c4-42b9-9f4a-d2b246816b39","Type":"ContainerStarted","Data":"cb502544e9dc4ce788074861e08f9626d6971055a6c5c793cb0bfda61f1b8178"}
Mar 08 06:35:53 crc kubenswrapper[4717]: I0308 06:35:53.517413 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2j" event={"ID":"7379e714-d0c4-42b9-9f4a-d2b246816b39","Type":"ContainerStarted","Data":"e57b01dff205b66781f4d776a356d5a0b8118c8f978d3588278160fd59e2c47a"}
Mar 08 06:35:54 crc kubenswrapper[4717]: I0308 06:35:54.532677 4717 generic.go:334] "Generic (PLEG): container finished" podID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerID="cb502544e9dc4ce788074861e08f9626d6971055a6c5c793cb0bfda61f1b8178" exitCode=0
Mar 08 06:35:54 crc kubenswrapper[4717]: I0308 06:35:54.532829 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2j" event={"ID":"7379e714-d0c4-42b9-9f4a-d2b246816b39","Type":"ContainerDied","Data":"cb502544e9dc4ce788074861e08f9626d6971055a6c5c793cb0bfda61f1b8178"}
Mar 08 06:35:55 crc kubenswrapper[4717]: I0308 06:35:55.544613 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2j" event={"ID":"7379e714-d0c4-42b9-9f4a-d2b246816b39","Type":"ContainerStarted","Data":"08a631b625961b57de50f450ab5fb4540eacc63954f1ff9a53452a508a835f19"}
Mar 08 06:35:56 crc kubenswrapper[4717]: I0308 06:35:56.564185 4717 generic.go:334] "Generic (PLEG): container finished" podID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerID="08a631b625961b57de50f450ab5fb4540eacc63954f1ff9a53452a508a835f19" exitCode=0
Mar 08 06:35:56 crc kubenswrapper[4717]: I0308 06:35:56.564260 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2j" event={"ID":"7379e714-d0c4-42b9-9f4a-d2b246816b39","Type":"ContainerDied","Data":"08a631b625961b57de50f450ab5fb4540eacc63954f1ff9a53452a508a835f19"}
Mar 08 06:35:58 crc kubenswrapper[4717]: I0308 06:35:58.594043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2j" event={"ID":"7379e714-d0c4-42b9-9f4a-d2b246816b39","Type":"ContainerStarted","Data":"38fda5ded05ac59d87433af80cd64b18d791131cd61bb645ca239b700ab93dee"}
Mar 08 06:35:58 crc kubenswrapper[4717]: I0308 06:35:58.639438 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-78j2j" podStartSLOduration=2.052511811 podStartE2EDuration="6.639388802s" podCreationTimestamp="2026-03-08 06:35:52 +0000 UTC" firstStartedPulling="2026-03-08 06:35:53.520623918 +0000 UTC m=+4180.438272792" lastFinishedPulling="2026-03-08 06:35:58.107500899 +0000 UTC m=+4185.025149783" observedRunningTime="2026-03-08 06:35:58.615447644 +0000 UTC m=+4185.533096518" watchObservedRunningTime="2026-03-08 06:35:58.639388802 +0000 UTC m=+4185.557037656"
Mar 08 06:36:00 crc kubenswrapper[4717]: I0308 06:36:00.153487 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549196-dgglx"]
Mar 08 06:36:00 crc kubenswrapper[4717]: I0308 06:36:00.155158 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549196-dgglx"
Mar 08 06:36:00 crc kubenswrapper[4717]: I0308 06:36:00.158145 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm"
Mar 08 06:36:00 crc kubenswrapper[4717]: I0308 06:36:00.159287 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 06:36:00 crc kubenswrapper[4717]: I0308 06:36:00.160044 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 06:36:00 crc kubenswrapper[4717]: I0308 06:36:00.182410 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549196-dgglx"]
Mar 08 06:36:00 crc kubenswrapper[4717]: I0308 06:36:00.203953 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msqd\" (UniqueName: \"kubernetes.io/projected/94cad26e-953d-4480-a4e2-29568afd1e00-kube-api-access-5msqd\") pod \"auto-csr-approver-29549196-dgglx\" (UID: \"94cad26e-953d-4480-a4e2-29568afd1e00\") " pod="openshift-infra/auto-csr-approver-29549196-dgglx"
Mar 08 06:36:00 crc kubenswrapper[4717]: I0308 06:36:00.306990 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msqd\" (UniqueName: \"kubernetes.io/projected/94cad26e-953d-4480-a4e2-29568afd1e00-kube-api-access-5msqd\") pod \"auto-csr-approver-29549196-dgglx\" (UID: \"94cad26e-953d-4480-a4e2-29568afd1e00\") " pod="openshift-infra/auto-csr-approver-29549196-dgglx"
Mar 08 06:36:00 crc kubenswrapper[4717]: I0308 06:36:00.584762 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msqd\" (UniqueName: \"kubernetes.io/projected/94cad26e-953d-4480-a4e2-29568afd1e00-kube-api-access-5msqd\") pod \"auto-csr-approver-29549196-dgglx\" (UID: \"94cad26e-953d-4480-a4e2-29568afd1e00\") " pod="openshift-infra/auto-csr-approver-29549196-dgglx"
Mar 08 06:36:00 crc kubenswrapper[4717]: I0308 06:36:00.780057 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549196-dgglx"
Mar 08 06:36:01 crc kubenswrapper[4717]: I0308 06:36:01.357057 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549196-dgglx"]
Mar 08 06:36:01 crc kubenswrapper[4717]: I0308 06:36:01.646928 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549196-dgglx" event={"ID":"94cad26e-953d-4480-a4e2-29568afd1e00","Type":"ContainerStarted","Data":"63309183556ee9b937e0aaeeb945ab5efedf0423a2f95dd6373bf985608deab2"}
Mar 08 06:36:02 crc kubenswrapper[4717]: I0308 06:36:02.519397 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:36:02 crc kubenswrapper[4717]: I0308 06:36:02.519686 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:36:02 crc kubenswrapper[4717]: I0308 06:36:02.581217 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:36:03 crc kubenswrapper[4717]: I0308 06:36:03.672004 4717 generic.go:334] "Generic (PLEG): container finished" podID="94cad26e-953d-4480-a4e2-29568afd1e00" containerID="869d3f8162ac35fa33c6c280b9a9c6ea1b4d1e1a76f43020c5afe7ed122b80a2" exitCode=0
Mar 08 06:36:03 crc kubenswrapper[4717]: I0308 06:36:03.672135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549196-dgglx" event={"ID":"94cad26e-953d-4480-a4e2-29568afd1e00","Type":"ContainerDied","Data":"869d3f8162ac35fa33c6c280b9a9c6ea1b4d1e1a76f43020c5afe7ed122b80a2"}
Mar 08 06:36:04 crc kubenswrapper[4717]: I0308 06:36:04.120085 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 06:36:04 crc kubenswrapper[4717]: I0308 06:36:04.120391 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 06:36:04 crc kubenswrapper[4717]: I0308 06:36:04.120431 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf"
Mar 08 06:36:04 crc kubenswrapper[4717]: I0308 06:36:04.121184 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08cc9c9ff5d454a3241f318529e4f53ef0a3ee0906337538951253195cd89ef9"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 06:36:04 crc kubenswrapper[4717]: I0308 06:36:04.121244 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://08cc9c9ff5d454a3241f318529e4f53ef0a3ee0906337538951253195cd89ef9" gracePeriod=600
Mar 08 06:36:04 crc kubenswrapper[4717]: I0308 06:36:04.688660 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="08cc9c9ff5d454a3241f318529e4f53ef0a3ee0906337538951253195cd89ef9" exitCode=0
Mar 08 06:36:04 crc kubenswrapper[4717]: I0308 06:36:04.688738 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"08cc9c9ff5d454a3241f318529e4f53ef0a3ee0906337538951253195cd89ef9"}
Mar 08 06:36:04 crc kubenswrapper[4717]: I0308 06:36:04.688829 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce"}
Mar 08 06:36:04 crc kubenswrapper[4717]: I0308 06:36:04.688867 4717 scope.go:117] "RemoveContainer" containerID="0a69e82cee9c20ada6197fb2b06bda31d4be8947a935cc5f26a801da3dcfa3a1"
Mar 08 06:36:05 crc kubenswrapper[4717]: I0308 06:36:05.108185 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549196-dgglx"
Mar 08 06:36:05 crc kubenswrapper[4717]: I0308 06:36:05.157733 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5msqd\" (UniqueName: \"kubernetes.io/projected/94cad26e-953d-4480-a4e2-29568afd1e00-kube-api-access-5msqd\") pod \"94cad26e-953d-4480-a4e2-29568afd1e00\" (UID: \"94cad26e-953d-4480-a4e2-29568afd1e00\") "
Mar 08 06:36:05 crc kubenswrapper[4717]: I0308 06:36:05.169966 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cad26e-953d-4480-a4e2-29568afd1e00-kube-api-access-5msqd" (OuterVolumeSpecName: "kube-api-access-5msqd") pod "94cad26e-953d-4480-a4e2-29568afd1e00" (UID: "94cad26e-953d-4480-a4e2-29568afd1e00"). InnerVolumeSpecName "kube-api-access-5msqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 06:36:05 crc kubenswrapper[4717]: I0308 06:36:05.259884 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5msqd\" (UniqueName: \"kubernetes.io/projected/94cad26e-953d-4480-a4e2-29568afd1e00-kube-api-access-5msqd\") on node \"crc\" DevicePath \"\""
Mar 08 06:36:05 crc kubenswrapper[4717]: I0308 06:36:05.721113 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549196-dgglx" event={"ID":"94cad26e-953d-4480-a4e2-29568afd1e00","Type":"ContainerDied","Data":"63309183556ee9b937e0aaeeb945ab5efedf0423a2f95dd6373bf985608deab2"}
Mar 08 06:36:05 crc kubenswrapper[4717]: I0308 06:36:05.721186 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63309183556ee9b937e0aaeeb945ab5efedf0423a2f95dd6373bf985608deab2"
Mar 08 06:36:05 crc kubenswrapper[4717]: I0308 06:36:05.721265 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549196-dgglx"
Mar 08 06:36:06 crc kubenswrapper[4717]: I0308 06:36:06.213124 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549190-cgwh4"]
Mar 08 06:36:06 crc kubenswrapper[4717]: I0308 06:36:06.229753 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549190-cgwh4"]
Mar 08 06:36:07 crc kubenswrapper[4717]: I0308 06:36:07.801611 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a0d469-5b49-463a-9f68-b7dba1c00091" path="/var/lib/kubelet/pods/25a0d469-5b49-463a-9f68-b7dba1c00091/volumes"
Mar 08 06:36:12 crc kubenswrapper[4717]: I0308 06:36:12.850243 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:36:12 crc kubenswrapper[4717]: I0308 06:36:12.925221 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78j2j"]
Mar 08 06:36:13 crc kubenswrapper[4717]: I0308 06:36:13.837557 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-78j2j" podUID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerName="registry-server" containerID="cri-o://38fda5ded05ac59d87433af80cd64b18d791131cd61bb645ca239b700ab93dee" gracePeriod=2
Mar 08 06:36:14 crc kubenswrapper[4717]: I0308 06:36:14.856978 4717 generic.go:334] "Generic (PLEG): container finished" podID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerID="38fda5ded05ac59d87433af80cd64b18d791131cd61bb645ca239b700ab93dee" exitCode=0
Mar 08 06:36:14 crc kubenswrapper[4717]: I0308 06:36:14.857607 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2j" event={"ID":"7379e714-d0c4-42b9-9f4a-d2b246816b39","Type":"ContainerDied","Data":"38fda5ded05ac59d87433af80cd64b18d791131cd61bb645ca239b700ab93dee"}
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.010297 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.207130 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-catalog-content\") pod \"7379e714-d0c4-42b9-9f4a-d2b246816b39\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") "
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.207392 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-utilities\") pod \"7379e714-d0c4-42b9-9f4a-d2b246816b39\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") "
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.207490 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j6qk\" (UniqueName: \"kubernetes.io/projected/7379e714-d0c4-42b9-9f4a-d2b246816b39-kube-api-access-5j6qk\") pod \"7379e714-d0c4-42b9-9f4a-d2b246816b39\" (UID: \"7379e714-d0c4-42b9-9f4a-d2b246816b39\") "
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.208251 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-utilities" (OuterVolumeSpecName: "utilities") pod "7379e714-d0c4-42b9-9f4a-d2b246816b39" (UID: "7379e714-d0c4-42b9-9f4a-d2b246816b39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.213803 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7379e714-d0c4-42b9-9f4a-d2b246816b39-kube-api-access-5j6qk" (OuterVolumeSpecName: "kube-api-access-5j6qk") pod "7379e714-d0c4-42b9-9f4a-d2b246816b39" (UID: "7379e714-d0c4-42b9-9f4a-d2b246816b39"). InnerVolumeSpecName "kube-api-access-5j6qk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.266371 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7379e714-d0c4-42b9-9f4a-d2b246816b39" (UID: "7379e714-d0c4-42b9-9f4a-d2b246816b39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.310037 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j6qk\" (UniqueName: \"kubernetes.io/projected/7379e714-d0c4-42b9-9f4a-d2b246816b39-kube-api-access-5j6qk\") on node \"crc\" DevicePath \"\""
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.310903 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.310920 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7379e714-d0c4-42b9-9f4a-d2b246816b39-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.883924 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78j2j" event={"ID":"7379e714-d0c4-42b9-9f4a-d2b246816b39","Type":"ContainerDied","Data":"e57b01dff205b66781f4d776a356d5a0b8118c8f978d3588278160fd59e2c47a"}
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.883988 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78j2j"
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.884336 4717 scope.go:117] "RemoveContainer" containerID="38fda5ded05ac59d87433af80cd64b18d791131cd61bb645ca239b700ab93dee"
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.936286 4717 scope.go:117] "RemoveContainer" containerID="08a631b625961b57de50f450ab5fb4540eacc63954f1ff9a53452a508a835f19"
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.950938 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78j2j"]
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.983431 4717 scope.go:117] "RemoveContainer" containerID="cb502544e9dc4ce788074861e08f9626d6971055a6c5c793cb0bfda61f1b8178"
Mar 08 06:36:16 crc kubenswrapper[4717]: I0308 06:36:16.999121 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-78j2j"]
Mar 08 06:36:17 crc kubenswrapper[4717]: E0308 06:36:17.128924 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7379e714_d0c4_42b9_9f4a_d2b246816b39.slice/crio-e57b01dff205b66781f4d776a356d5a0b8118c8f978d3588278160fd59e2c47a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7379e714_d0c4_42b9_9f4a_d2b246816b39.slice\": RecentStats: unable to find data in memory cache]"
Mar 08 06:36:17 crc kubenswrapper[4717]: I0308 06:36:17.797894 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7379e714-d0c4-42b9-9f4a-d2b246816b39" path="/var/lib/kubelet/pods/7379e714-d0c4-42b9-9f4a-d2b246816b39/volumes"
Mar 08 06:36:39 crc kubenswrapper[4717]: I0308 06:36:39.067831 4717 scope.go:117] "RemoveContainer" containerID="2b970b93d4eecb4a59e160db147234496e52c60c2b59b66fd7b24150721a5865"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.779174 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cm7fx"]
Mar 08 06:37:20 crc kubenswrapper[4717]: E0308 06:37:20.780395 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerName="extract-content"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.780415 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerName="extract-content"
Mar 08 06:37:20 crc kubenswrapper[4717]: E0308 06:37:20.780431 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerName="extract-utilities"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.780442 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerName="extract-utilities"
Mar 08 06:37:20 crc kubenswrapper[4717]: E0308 06:37:20.780477 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cad26e-953d-4480-a4e2-29568afd1e00" containerName="oc"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.780488 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cad26e-953d-4480-a4e2-29568afd1e00" containerName="oc"
Mar 08 06:37:20 crc kubenswrapper[4717]: E0308 06:37:20.780526 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerName="registry-server"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.780537 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerName="registry-server"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.780811 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="94cad26e-953d-4480-a4e2-29568afd1e00" containerName="oc"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.780844 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7379e714-d0c4-42b9-9f4a-d2b246816b39" containerName="registry-server"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.782895 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.795455 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cm7fx"]
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.808545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-catalog-content\") pod \"certified-operators-cm7fx\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") " pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.808721 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-utilities\") pod \"certified-operators-cm7fx\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") " pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.808801 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d78qz\" (UniqueName: \"kubernetes.io/projected/d77dda77-990d-47ce-92e3-8ab524225c24-kube-api-access-d78qz\") pod \"certified-operators-cm7fx\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") " pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.911132 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-utilities\") pod \"certified-operators-cm7fx\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") " pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.911195 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d78qz\" (UniqueName: \"kubernetes.io/projected/d77dda77-990d-47ce-92e3-8ab524225c24-kube-api-access-d78qz\") pod \"certified-operators-cm7fx\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") " pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.911469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-catalog-content\") pod \"certified-operators-cm7fx\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") " pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.912084 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-utilities\") pod \"certified-operators-cm7fx\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") " pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.912133 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-catalog-content\") pod \"certified-operators-cm7fx\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") " pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:20 crc kubenswrapper[4717]: I0308 06:37:20.942434 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d78qz\" (UniqueName: \"kubernetes.io/projected/d77dda77-990d-47ce-92e3-8ab524225c24-kube-api-access-d78qz\") pod \"certified-operators-cm7fx\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") " pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:21 crc kubenswrapper[4717]: I0308 06:37:21.106271 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:21 crc kubenswrapper[4717]: I0308 06:37:21.606010 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cm7fx"]
Mar 08 06:37:21 crc kubenswrapper[4717]: I0308 06:37:21.648677 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm7fx" event={"ID":"d77dda77-990d-47ce-92e3-8ab524225c24","Type":"ContainerStarted","Data":"29628b62b7f988e76aef8f58e50afb63aad69bb02699b681ea0868a1fa495d84"}
Mar 08 06:37:22 crc kubenswrapper[4717]: I0308 06:37:22.660282 4717 generic.go:334] "Generic (PLEG): container finished" podID="d77dda77-990d-47ce-92e3-8ab524225c24" containerID="ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2" exitCode=0
Mar 08 06:37:22 crc kubenswrapper[4717]: I0308 06:37:22.660776 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm7fx" event={"ID":"d77dda77-990d-47ce-92e3-8ab524225c24","Type":"ContainerDied","Data":"ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2"}
Mar 08 06:37:24 crc kubenswrapper[4717]: I0308 06:37:24.690595 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm7fx" event={"ID":"d77dda77-990d-47ce-92e3-8ab524225c24","Type":"ContainerStarted","Data":"d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e"}
Mar 08 06:37:28 crc kubenswrapper[4717]: I0308 06:37:28.733379 4717 generic.go:334] "Generic (PLEG): container finished" podID="d77dda77-990d-47ce-92e3-8ab524225c24" containerID="d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e" exitCode=0
Mar 08 06:37:28 crc kubenswrapper[4717]: I0308 06:37:28.733456 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm7fx" event={"ID":"d77dda77-990d-47ce-92e3-8ab524225c24","Type":"ContainerDied","Data":"d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e"}
Mar 08 06:37:37 crc kubenswrapper[4717]: I0308 06:37:37.895012 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm7fx" event={"ID":"d77dda77-990d-47ce-92e3-8ab524225c24","Type":"ContainerStarted","Data":"5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025"}
Mar 08 06:37:37 crc kubenswrapper[4717]: I0308 06:37:37.921653 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cm7fx" podStartSLOduration=4.365998107 podStartE2EDuration="17.921631105s" podCreationTimestamp="2026-03-08 06:37:20 +0000 UTC" firstStartedPulling="2026-03-08 06:37:22.667030413 +0000 UTC m=+4269.584679297" lastFinishedPulling="2026-03-08 06:37:36.222663411 +0000 UTC m=+4283.140312295" observedRunningTime="2026-03-08 06:37:37.913041844 +0000 UTC m=+4284.830690688" watchObservedRunningTime="2026-03-08 06:37:37.921631105 +0000 UTC m=+4284.839279949"
Mar 08 06:37:41 crc kubenswrapper[4717]: I0308 06:37:41.107348 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:41 crc kubenswrapper[4717]: I0308 06:37:41.107966 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:42 crc kubenswrapper[4717]: I0308 06:37:42.177329 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cm7fx" podUID="d77dda77-990d-47ce-92e3-8ab524225c24" containerName="registry-server" probeResult="failure" output=<
Mar 08 06:37:42 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s
Mar 08 06:37:42 crc kubenswrapper[4717]: >
Mar 08 06:37:51 crc kubenswrapper[4717]: I0308 06:37:51.255229 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:51 crc kubenswrapper[4717]: I0308 06:37:51.316381 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:51 crc kubenswrapper[4717]: I0308 06:37:51.971102 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cm7fx"]
Mar 08 06:37:53 crc kubenswrapper[4717]: I0308 06:37:53.066365 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cm7fx" podUID="d77dda77-990d-47ce-92e3-8ab524225c24" containerName="registry-server" containerID="cri-o://5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025" gracePeriod=2
Mar 08 06:37:53 crc kubenswrapper[4717]: I0308 06:37:53.661921 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cm7fx"
Mar 08 06:37:53 crc kubenswrapper[4717]: I0308 06:37:53.835085 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-utilities\") pod \"d77dda77-990d-47ce-92e3-8ab524225c24\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") "
Mar 08 06:37:53 crc kubenswrapper[4717]: I0308 06:37:53.835167 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-catalog-content\") pod \"d77dda77-990d-47ce-92e3-8ab524225c24\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") "
Mar 08 06:37:53 crc kubenswrapper[4717]: I0308 06:37:53.835215 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d78qz\" (UniqueName: \"kubernetes.io/projected/d77dda77-990d-47ce-92e3-8ab524225c24-kube-api-access-d78qz\") pod \"d77dda77-990d-47ce-92e3-8ab524225c24\" (UID: \"d77dda77-990d-47ce-92e3-8ab524225c24\") "
Mar 08 06:37:53 crc kubenswrapper[4717]: I0308 06:37:53.838153 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-utilities" (OuterVolumeSpecName: "utilities") pod "d77dda77-990d-47ce-92e3-8ab524225c24" (UID: "d77dda77-990d-47ce-92e3-8ab524225c24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 06:37:53 crc kubenswrapper[4717]: I0308 06:37:53.850522 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77dda77-990d-47ce-92e3-8ab524225c24-kube-api-access-d78qz" (OuterVolumeSpecName: "kube-api-access-d78qz") pod "d77dda77-990d-47ce-92e3-8ab524225c24" (UID: "d77dda77-990d-47ce-92e3-8ab524225c24"). InnerVolumeSpecName "kube-api-access-d78qz".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:37:53 crc kubenswrapper[4717]: I0308 06:37:53.937987 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:37:53 crc kubenswrapper[4717]: I0308 06:37:53.938229 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d78qz\" (UniqueName: \"kubernetes.io/projected/d77dda77-990d-47ce-92e3-8ab524225c24-kube-api-access-d78qz\") on node \"crc\" DevicePath \"\"" Mar 08 06:37:53 crc kubenswrapper[4717]: I0308 06:37:53.947480 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d77dda77-990d-47ce-92e3-8ab524225c24" (UID: "d77dda77-990d-47ce-92e3-8ab524225c24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.040166 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77dda77-990d-47ce-92e3-8ab524225c24-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.074845 4717 generic.go:334] "Generic (PLEG): container finished" podID="d77dda77-990d-47ce-92e3-8ab524225c24" containerID="5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025" exitCode=0 Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.074887 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cm7fx" event={"ID":"d77dda77-990d-47ce-92e3-8ab524225c24","Type":"ContainerDied","Data":"5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025"} Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.074915 4717 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-cm7fx" event={"ID":"d77dda77-990d-47ce-92e3-8ab524225c24","Type":"ContainerDied","Data":"29628b62b7f988e76aef8f58e50afb63aad69bb02699b681ea0868a1fa495d84"} Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.074920 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cm7fx" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.074949 4717 scope.go:117] "RemoveContainer" containerID="5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.092610 4717 scope.go:117] "RemoveContainer" containerID="d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.109075 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cm7fx"] Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.117525 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cm7fx"] Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.145074 4717 scope.go:117] "RemoveContainer" containerID="ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.163756 4717 scope.go:117] "RemoveContainer" containerID="5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025" Mar 08 06:37:54 crc kubenswrapper[4717]: E0308 06:37:54.164093 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025\": container with ID starting with 5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025 not found: ID does not exist" containerID="5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 
06:37:54.164155 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025"} err="failed to get container status \"5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025\": rpc error: code = NotFound desc = could not find container \"5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025\": container with ID starting with 5d46458d3ba0aa160850e1c8ea3ba060c50618b6d60b0787ec5469cfdbd5f025 not found: ID does not exist" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.164219 4717 scope.go:117] "RemoveContainer" containerID="d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e" Mar 08 06:37:54 crc kubenswrapper[4717]: E0308 06:37:54.164514 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e\": container with ID starting with d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e not found: ID does not exist" containerID="d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.164544 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e"} err="failed to get container status \"d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e\": rpc error: code = NotFound desc = could not find container \"d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e\": container with ID starting with d752f831318c031083a6b7b6e093c0e0c5ad2b83cd2e0df7114ae97d8fee3c6e not found: ID does not exist" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.164567 4717 scope.go:117] "RemoveContainer" containerID="ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2" Mar 08 06:37:54 crc 
kubenswrapper[4717]: E0308 06:37:54.164966 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2\": container with ID starting with ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2 not found: ID does not exist" containerID="ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2" Mar 08 06:37:54 crc kubenswrapper[4717]: I0308 06:37:54.165000 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2"} err="failed to get container status \"ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2\": rpc error: code = NotFound desc = could not find container \"ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2\": container with ID starting with ae84ef26d01653900e6afdc2ddf10e96c4cd7a636be22fd9b3153569b38a39e2 not found: ID does not exist" Mar 08 06:37:55 crc kubenswrapper[4717]: I0308 06:37:55.801910 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d77dda77-990d-47ce-92e3-8ab524225c24" path="/var/lib/kubelet/pods/d77dda77-990d-47ce-92e3-8ab524225c24/volumes" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.154602 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549198-6xc2v"] Mar 08 06:38:00 crc kubenswrapper[4717]: E0308 06:38:00.155755 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77dda77-990d-47ce-92e3-8ab524225c24" containerName="extract-utilities" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.155773 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77dda77-990d-47ce-92e3-8ab524225c24" containerName="extract-utilities" Mar 08 06:38:00 crc kubenswrapper[4717]: E0308 06:38:00.155806 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d77dda77-990d-47ce-92e3-8ab524225c24" containerName="registry-server" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.155814 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77dda77-990d-47ce-92e3-8ab524225c24" containerName="registry-server" Mar 08 06:38:00 crc kubenswrapper[4717]: E0308 06:38:00.155838 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77dda77-990d-47ce-92e3-8ab524225c24" containerName="extract-content" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.155846 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77dda77-990d-47ce-92e3-8ab524225c24" containerName="extract-content" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.156092 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77dda77-990d-47ce-92e3-8ab524225c24" containerName="registry-server" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.156927 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549198-6xc2v" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.159889 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.160017 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.166631 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.174931 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549198-6xc2v"] Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.220874 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rf95\" (UniqueName: 
\"kubernetes.io/projected/60797c97-784d-4608-afe0-111159210b6a-kube-api-access-9rf95\") pod \"auto-csr-approver-29549198-6xc2v\" (UID: \"60797c97-784d-4608-afe0-111159210b6a\") " pod="openshift-infra/auto-csr-approver-29549198-6xc2v" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.323238 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rf95\" (UniqueName: \"kubernetes.io/projected/60797c97-784d-4608-afe0-111159210b6a-kube-api-access-9rf95\") pod \"auto-csr-approver-29549198-6xc2v\" (UID: \"60797c97-784d-4608-afe0-111159210b6a\") " pod="openshift-infra/auto-csr-approver-29549198-6xc2v" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.354386 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rf95\" (UniqueName: \"kubernetes.io/projected/60797c97-784d-4608-afe0-111159210b6a-kube-api-access-9rf95\") pod \"auto-csr-approver-29549198-6xc2v\" (UID: \"60797c97-784d-4608-afe0-111159210b6a\") " pod="openshift-infra/auto-csr-approver-29549198-6xc2v" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.511665 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549198-6xc2v" Mar 08 06:38:00 crc kubenswrapper[4717]: I0308 06:38:00.994071 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549198-6xc2v"] Mar 08 06:38:01 crc kubenswrapper[4717]: W0308 06:38:01.691793 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60797c97_784d_4608_afe0_111159210b6a.slice/crio-2150ccbab655742ace2126de8faa160cd7151bc3c27cb23db92ee29d0b5a5733 WatchSource:0}: Error finding container 2150ccbab655742ace2126de8faa160cd7151bc3c27cb23db92ee29d0b5a5733: Status 404 returned error can't find the container with id 2150ccbab655742ace2126de8faa160cd7151bc3c27cb23db92ee29d0b5a5733 Mar 08 06:38:02 crc kubenswrapper[4717]: I0308 06:38:02.171530 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549198-6xc2v" event={"ID":"60797c97-784d-4608-afe0-111159210b6a","Type":"ContainerStarted","Data":"2150ccbab655742ace2126de8faa160cd7151bc3c27cb23db92ee29d0b5a5733"} Mar 08 06:38:04 crc kubenswrapper[4717]: I0308 06:38:04.120282 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:38:04 crc kubenswrapper[4717]: I0308 06:38:04.120584 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:38:05 crc kubenswrapper[4717]: I0308 06:38:05.195154 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="60797c97-784d-4608-afe0-111159210b6a" containerID="481fdc3460af8622a4b1e1d8865a983573294da193424323bb4303efe5064c46" exitCode=0 Mar 08 06:38:05 crc kubenswrapper[4717]: I0308 06:38:05.195242 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549198-6xc2v" event={"ID":"60797c97-784d-4608-afe0-111159210b6a","Type":"ContainerDied","Data":"481fdc3460af8622a4b1e1d8865a983573294da193424323bb4303efe5064c46"} Mar 08 06:38:06 crc kubenswrapper[4717]: I0308 06:38:06.655716 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549198-6xc2v" Mar 08 06:38:06 crc kubenswrapper[4717]: I0308 06:38:06.774888 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rf95\" (UniqueName: \"kubernetes.io/projected/60797c97-784d-4608-afe0-111159210b6a-kube-api-access-9rf95\") pod \"60797c97-784d-4608-afe0-111159210b6a\" (UID: \"60797c97-784d-4608-afe0-111159210b6a\") " Mar 08 06:38:06 crc kubenswrapper[4717]: I0308 06:38:06.780676 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60797c97-784d-4608-afe0-111159210b6a-kube-api-access-9rf95" (OuterVolumeSpecName: "kube-api-access-9rf95") pod "60797c97-784d-4608-afe0-111159210b6a" (UID: "60797c97-784d-4608-afe0-111159210b6a"). InnerVolumeSpecName "kube-api-access-9rf95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:38:06 crc kubenswrapper[4717]: I0308 06:38:06.877217 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rf95\" (UniqueName: \"kubernetes.io/projected/60797c97-784d-4608-afe0-111159210b6a-kube-api-access-9rf95\") on node \"crc\" DevicePath \"\"" Mar 08 06:38:07 crc kubenswrapper[4717]: I0308 06:38:07.225336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549198-6xc2v" event={"ID":"60797c97-784d-4608-afe0-111159210b6a","Type":"ContainerDied","Data":"2150ccbab655742ace2126de8faa160cd7151bc3c27cb23db92ee29d0b5a5733"} Mar 08 06:38:07 crc kubenswrapper[4717]: I0308 06:38:07.225386 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2150ccbab655742ace2126de8faa160cd7151bc3c27cb23db92ee29d0b5a5733" Mar 08 06:38:07 crc kubenswrapper[4717]: I0308 06:38:07.225598 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549198-6xc2v" Mar 08 06:38:07 crc kubenswrapper[4717]: I0308 06:38:07.730273 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549192-shjql"] Mar 08 06:38:07 crc kubenswrapper[4717]: I0308 06:38:07.739341 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549192-shjql"] Mar 08 06:38:07 crc kubenswrapper[4717]: I0308 06:38:07.798061 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca011fc0-86f8-4a30-b556-bb853740acab" path="/var/lib/kubelet/pods/ca011fc0-86f8-4a30-b556-bb853740acab/volumes" Mar 08 06:38:34 crc kubenswrapper[4717]: I0308 06:38:34.120233 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 08 06:38:34 crc kubenswrapper[4717]: I0308 06:38:34.120757 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:38:39 crc kubenswrapper[4717]: I0308 06:38:39.206932 4717 scope.go:117] "RemoveContainer" containerID="48899c95f83c04e32d18cb58f07477aa1651aa838367d7099b1d207740ecb2de" Mar 08 06:39:04 crc kubenswrapper[4717]: I0308 06:39:04.119759 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:39:04 crc kubenswrapper[4717]: I0308 06:39:04.120415 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:39:04 crc kubenswrapper[4717]: I0308 06:39:04.120482 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 06:39:04 crc kubenswrapper[4717]: I0308 06:39:04.121452 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
08 06:39:04 crc kubenswrapper[4717]: I0308 06:39:04.121565 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" gracePeriod=600 Mar 08 06:39:04 crc kubenswrapper[4717]: E0308 06:39:04.246042 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:39:04 crc kubenswrapper[4717]: I0308 06:39:04.893104 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" exitCode=0 Mar 08 06:39:04 crc kubenswrapper[4717]: I0308 06:39:04.893148 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce"} Mar 08 06:39:04 crc kubenswrapper[4717]: I0308 06:39:04.893184 4717 scope.go:117] "RemoveContainer" containerID="08cc9c9ff5d454a3241f318529e4f53ef0a3ee0906337538951253195cd89ef9" Mar 08 06:39:04 crc kubenswrapper[4717]: I0308 06:39:04.893970 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:39:04 crc kubenswrapper[4717]: E0308 06:39:04.894327 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:39:18 crc kubenswrapper[4717]: I0308 06:39:18.781883 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:39:18 crc kubenswrapper[4717]: E0308 06:39:18.782896 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:39:29 crc kubenswrapper[4717]: I0308 06:39:29.781851 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:39:29 crc kubenswrapper[4717]: E0308 06:39:29.782718 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:39:44 crc kubenswrapper[4717]: I0308 06:39:44.782335 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:39:44 crc kubenswrapper[4717]: E0308 06:39:44.783430 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:39:55 crc kubenswrapper[4717]: I0308 06:39:55.781400 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:39:55 crc kubenswrapper[4717]: E0308 06:39:55.782398 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.170857 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549200-zph9l"] Mar 08 06:40:00 crc kubenswrapper[4717]: E0308 06:40:00.174408 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60797c97-784d-4608-afe0-111159210b6a" containerName="oc" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.174626 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="60797c97-784d-4608-afe0-111159210b6a" containerName="oc" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.175343 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="60797c97-784d-4608-afe0-111159210b6a" containerName="oc" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.177245 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549200-zph9l" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.183018 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.184047 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.184130 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.184761 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549200-zph9l"] Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.258016 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwh28\" (UniqueName: \"kubernetes.io/projected/21870681-96ec-4a38-8f09-a774da82e554-kube-api-access-pwh28\") pod \"auto-csr-approver-29549200-zph9l\" (UID: \"21870681-96ec-4a38-8f09-a774da82e554\") " pod="openshift-infra/auto-csr-approver-29549200-zph9l" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.360316 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwh28\" (UniqueName: \"kubernetes.io/projected/21870681-96ec-4a38-8f09-a774da82e554-kube-api-access-pwh28\") pod \"auto-csr-approver-29549200-zph9l\" (UID: \"21870681-96ec-4a38-8f09-a774da82e554\") " pod="openshift-infra/auto-csr-approver-29549200-zph9l" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.380870 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwh28\" (UniqueName: \"kubernetes.io/projected/21870681-96ec-4a38-8f09-a774da82e554-kube-api-access-pwh28\") pod \"auto-csr-approver-29549200-zph9l\" (UID: \"21870681-96ec-4a38-8f09-a774da82e554\") " 
pod="openshift-infra/auto-csr-approver-29549200-zph9l" Mar 08 06:40:00 crc kubenswrapper[4717]: I0308 06:40:00.507764 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549200-zph9l" Mar 08 06:40:01 crc kubenswrapper[4717]: I0308 06:40:01.026233 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549200-zph9l"] Mar 08 06:40:01 crc kubenswrapper[4717]: I0308 06:40:01.026295 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 06:40:01 crc kubenswrapper[4717]: I0308 06:40:01.554578 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549200-zph9l" event={"ID":"21870681-96ec-4a38-8f09-a774da82e554","Type":"ContainerStarted","Data":"504d76b62f4b1a8a4740af5ef9c7d8f484c59f7b025365b91a930487f87edec2"} Mar 08 06:40:02 crc kubenswrapper[4717]: I0308 06:40:02.565906 4717 generic.go:334] "Generic (PLEG): container finished" podID="21870681-96ec-4a38-8f09-a774da82e554" containerID="67464bcc1413496b67a7c4deafa90eb6a6125f6f4703ee0099db97108ca8fa1a" exitCode=0 Mar 08 06:40:02 crc kubenswrapper[4717]: I0308 06:40:02.565994 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549200-zph9l" event={"ID":"21870681-96ec-4a38-8f09-a774da82e554","Type":"ContainerDied","Data":"67464bcc1413496b67a7c4deafa90eb6a6125f6f4703ee0099db97108ca8fa1a"} Mar 08 06:40:03 crc kubenswrapper[4717]: I0308 06:40:03.976854 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549200-zph9l" Mar 08 06:40:04 crc kubenswrapper[4717]: I0308 06:40:04.075467 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwh28\" (UniqueName: \"kubernetes.io/projected/21870681-96ec-4a38-8f09-a774da82e554-kube-api-access-pwh28\") pod \"21870681-96ec-4a38-8f09-a774da82e554\" (UID: \"21870681-96ec-4a38-8f09-a774da82e554\") " Mar 08 06:40:04 crc kubenswrapper[4717]: I0308 06:40:04.084842 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21870681-96ec-4a38-8f09-a774da82e554-kube-api-access-pwh28" (OuterVolumeSpecName: "kube-api-access-pwh28") pod "21870681-96ec-4a38-8f09-a774da82e554" (UID: "21870681-96ec-4a38-8f09-a774da82e554"). InnerVolumeSpecName "kube-api-access-pwh28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:40:04 crc kubenswrapper[4717]: I0308 06:40:04.178126 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwh28\" (UniqueName: \"kubernetes.io/projected/21870681-96ec-4a38-8f09-a774da82e554-kube-api-access-pwh28\") on node \"crc\" DevicePath \"\"" Mar 08 06:40:04 crc kubenswrapper[4717]: I0308 06:40:04.586826 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549200-zph9l" event={"ID":"21870681-96ec-4a38-8f09-a774da82e554","Type":"ContainerDied","Data":"504d76b62f4b1a8a4740af5ef9c7d8f484c59f7b025365b91a930487f87edec2"} Mar 08 06:40:04 crc kubenswrapper[4717]: I0308 06:40:04.586870 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="504d76b62f4b1a8a4740af5ef9c7d8f484c59f7b025365b91a930487f87edec2" Mar 08 06:40:04 crc kubenswrapper[4717]: I0308 06:40:04.586907 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549200-zph9l" Mar 08 06:40:05 crc kubenswrapper[4717]: I0308 06:40:05.080041 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549194-bw44x"] Mar 08 06:40:05 crc kubenswrapper[4717]: I0308 06:40:05.091676 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549194-bw44x"] Mar 08 06:40:05 crc kubenswrapper[4717]: I0308 06:40:05.797050 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3" path="/var/lib/kubelet/pods/c2d5f6d8-cd5f-4ed7-8d7c-bd106207b0b3/volumes" Mar 08 06:40:08 crc kubenswrapper[4717]: I0308 06:40:08.782747 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:40:08 crc kubenswrapper[4717]: E0308 06:40:08.783636 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:40:22 crc kubenswrapper[4717]: I0308 06:40:22.782275 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:40:22 crc kubenswrapper[4717]: E0308 06:40:22.783616 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:40:37 crc kubenswrapper[4717]: I0308 06:40:37.783295 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:40:37 crc kubenswrapper[4717]: E0308 06:40:37.788755 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:40:39 crc kubenswrapper[4717]: I0308 06:40:39.341447 4717 scope.go:117] "RemoveContainer" containerID="38d7f7632f4928a26cdbfbcf48e6fe99fbad6712dfd8cbf49f7c3f6d7c9840fd" Mar 08 06:40:51 crc kubenswrapper[4717]: I0308 06:40:51.782356 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:40:51 crc kubenswrapper[4717]: E0308 06:40:51.783525 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:41:02 crc kubenswrapper[4717]: I0308 06:41:02.782727 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:41:02 crc kubenswrapper[4717]: E0308 06:41:02.783853 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:41:17 crc kubenswrapper[4717]: I0308 06:41:17.782861 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:41:17 crc kubenswrapper[4717]: E0308 06:41:17.785387 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:41:29 crc kubenswrapper[4717]: I0308 06:41:29.782496 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:41:29 crc kubenswrapper[4717]: E0308 06:41:29.783549 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:41:41 crc kubenswrapper[4717]: E0308 06:41:41.397575 4717 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.44:41330->38.102.83.44:42899: read tcp 38.102.83.44:41330->38.102.83.44:42899: read: connection reset by peer Mar 08 06:41:44 crc kubenswrapper[4717]: I0308 06:41:44.783058 4717 scope.go:117] "RemoveContainer" 
containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:41:44 crc kubenswrapper[4717]: E0308 06:41:44.784406 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:41:56 crc kubenswrapper[4717]: I0308 06:41:56.782333 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:41:56 crc kubenswrapper[4717]: E0308 06:41:56.783077 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.156117 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549202-wcbhf"] Mar 08 06:42:00 crc kubenswrapper[4717]: E0308 06:42:00.158070 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21870681-96ec-4a38-8f09-a774da82e554" containerName="oc" Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.158194 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="21870681-96ec-4a38-8f09-a774da82e554" containerName="oc" Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.158536 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="21870681-96ec-4a38-8f09-a774da82e554" containerName="oc" Mar 08 06:42:00 crc 
kubenswrapper[4717]: I0308 06:42:00.159560 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549202-wcbhf" Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.161597 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.164317 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.164880 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.178983 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549202-wcbhf"] Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.314729 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnfp5\" (UniqueName: \"kubernetes.io/projected/d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683-kube-api-access-nnfp5\") pod \"auto-csr-approver-29549202-wcbhf\" (UID: \"d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683\") " pod="openshift-infra/auto-csr-approver-29549202-wcbhf" Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.416463 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnfp5\" (UniqueName: \"kubernetes.io/projected/d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683-kube-api-access-nnfp5\") pod \"auto-csr-approver-29549202-wcbhf\" (UID: \"d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683\") " pod="openshift-infra/auto-csr-approver-29549202-wcbhf" Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.436846 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnfp5\" (UniqueName: \"kubernetes.io/projected/d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683-kube-api-access-nnfp5\") pod 
\"auto-csr-approver-29549202-wcbhf\" (UID: \"d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683\") " pod="openshift-infra/auto-csr-approver-29549202-wcbhf" Mar 08 06:42:00 crc kubenswrapper[4717]: I0308 06:42:00.480564 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549202-wcbhf" Mar 08 06:42:01 crc kubenswrapper[4717]: I0308 06:42:01.657239 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549202-wcbhf"] Mar 08 06:42:01 crc kubenswrapper[4717]: I0308 06:42:01.887237 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549202-wcbhf" event={"ID":"d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683","Type":"ContainerStarted","Data":"cc5e4606b009ac8552d17fb7c324d8e9e583dd6d36d78d2ee15cd1f5dd1d7499"} Mar 08 06:42:02 crc kubenswrapper[4717]: I0308 06:42:02.896803 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549202-wcbhf" event={"ID":"d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683","Type":"ContainerStarted","Data":"76cd506abf9d822f053d10014620a16c9d63f125d5157645a40efa080fa682b9"} Mar 08 06:42:02 crc kubenswrapper[4717]: I0308 06:42:02.924259 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549202-wcbhf" podStartSLOduration=2.037673472 podStartE2EDuration="2.924241214s" podCreationTimestamp="2026-03-08 06:42:00 +0000 UTC" firstStartedPulling="2026-03-08 06:42:01.661712429 +0000 UTC m=+4548.579361293" lastFinishedPulling="2026-03-08 06:42:02.548280151 +0000 UTC m=+4549.465929035" observedRunningTime="2026-03-08 06:42:02.913146632 +0000 UTC m=+4549.830795476" watchObservedRunningTime="2026-03-08 06:42:02.924241214 +0000 UTC m=+4549.841890058" Mar 08 06:42:03 crc kubenswrapper[4717]: I0308 06:42:03.920430 4717 generic.go:334] "Generic (PLEG): container finished" podID="d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683" 
containerID="76cd506abf9d822f053d10014620a16c9d63f125d5157645a40efa080fa682b9" exitCode=0 Mar 08 06:42:03 crc kubenswrapper[4717]: I0308 06:42:03.920493 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549202-wcbhf" event={"ID":"d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683","Type":"ContainerDied","Data":"76cd506abf9d822f053d10014620a16c9d63f125d5157645a40efa080fa682b9"} Mar 08 06:42:05 crc kubenswrapper[4717]: I0308 06:42:05.299897 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549202-wcbhf" Mar 08 06:42:05 crc kubenswrapper[4717]: I0308 06:42:05.428406 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnfp5\" (UniqueName: \"kubernetes.io/projected/d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683-kube-api-access-nnfp5\") pod \"d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683\" (UID: \"d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683\") " Mar 08 06:42:05 crc kubenswrapper[4717]: I0308 06:42:05.433633 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683-kube-api-access-nnfp5" (OuterVolumeSpecName: "kube-api-access-nnfp5") pod "d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683" (UID: "d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683"). InnerVolumeSpecName "kube-api-access-nnfp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:42:05 crc kubenswrapper[4717]: I0308 06:42:05.531528 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnfp5\" (UniqueName: \"kubernetes.io/projected/d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683-kube-api-access-nnfp5\") on node \"crc\" DevicePath \"\"" Mar 08 06:42:05 crc kubenswrapper[4717]: I0308 06:42:05.942241 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549202-wcbhf" event={"ID":"d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683","Type":"ContainerDied","Data":"cc5e4606b009ac8552d17fb7c324d8e9e583dd6d36d78d2ee15cd1f5dd1d7499"} Mar 08 06:42:05 crc kubenswrapper[4717]: I0308 06:42:05.942278 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc5e4606b009ac8552d17fb7c324d8e9e583dd6d36d78d2ee15cd1f5dd1d7499" Mar 08 06:42:05 crc kubenswrapper[4717]: I0308 06:42:05.942325 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549202-wcbhf" Mar 08 06:42:06 crc kubenswrapper[4717]: I0308 06:42:06.003988 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549196-dgglx"] Mar 08 06:42:06 crc kubenswrapper[4717]: I0308 06:42:06.012075 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549196-dgglx"] Mar 08 06:42:07 crc kubenswrapper[4717]: I0308 06:42:07.791959 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94cad26e-953d-4480-a4e2-29568afd1e00" path="/var/lib/kubelet/pods/94cad26e-953d-4480-a4e2-29568afd1e00/volumes" Mar 08 06:42:10 crc kubenswrapper[4717]: I0308 06:42:10.781410 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:42:10 crc kubenswrapper[4717]: E0308 06:42:10.781898 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:42:22 crc kubenswrapper[4717]: I0308 06:42:22.781850 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:42:22 crc kubenswrapper[4717]: E0308 06:42:22.782648 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:42:34 crc kubenswrapper[4717]: I0308 06:42:34.782027 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:42:34 crc kubenswrapper[4717]: E0308 06:42:34.782904 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:42:39 crc kubenswrapper[4717]: I0308 06:42:39.475214 4717 scope.go:117] "RemoveContainer" containerID="869d3f8162ac35fa33c6c280b9a9c6ea1b4d1e1a76f43020c5afe7ed122b80a2" Mar 08 06:42:49 crc kubenswrapper[4717]: I0308 06:42:49.782560 4717 scope.go:117] "RemoveContainer" 
containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:42:49 crc kubenswrapper[4717]: E0308 06:42:49.783572 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:43:00 crc kubenswrapper[4717]: I0308 06:43:00.781847 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:43:00 crc kubenswrapper[4717]: E0308 06:43:00.782530 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.314871 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bnrhm"] Mar 08 06:43:01 crc kubenswrapper[4717]: E0308 06:43:01.315484 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683" containerName="oc" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.315507 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683" containerName="oc" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.315757 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683" containerName="oc" Mar 08 06:43:01 crc 
kubenswrapper[4717]: I0308 06:43:01.317633 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.338013 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnrhm"] Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.396125 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-utilities\") pod \"redhat-marketplace-bnrhm\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.396271 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjw82\" (UniqueName: \"kubernetes.io/projected/3b55fb65-fed7-4302-8832-5c8a71919d57-kube-api-access-rjw82\") pod \"redhat-marketplace-bnrhm\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.396340 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-catalog-content\") pod \"redhat-marketplace-bnrhm\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.498738 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-utilities\") pod \"redhat-marketplace-bnrhm\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:01 crc 
kubenswrapper[4717]: I0308 06:43:01.498839 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjw82\" (UniqueName: \"kubernetes.io/projected/3b55fb65-fed7-4302-8832-5c8a71919d57-kube-api-access-rjw82\") pod \"redhat-marketplace-bnrhm\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.498884 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-catalog-content\") pod \"redhat-marketplace-bnrhm\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.499450 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-utilities\") pod \"redhat-marketplace-bnrhm\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.499464 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-catalog-content\") pod \"redhat-marketplace-bnrhm\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 06:43:01.522560 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjw82\" (UniqueName: \"kubernetes.io/projected/3b55fb65-fed7-4302-8832-5c8a71919d57-kube-api-access-rjw82\") pod \"redhat-marketplace-bnrhm\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:01 crc kubenswrapper[4717]: I0308 
06:43:01.650519 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:02 crc kubenswrapper[4717]: I0308 06:43:02.110172 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnrhm"] Mar 08 06:43:02 crc kubenswrapper[4717]: I0308 06:43:02.538558 4717 generic.go:334] "Generic (PLEG): container finished" podID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerID="c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d" exitCode=0 Mar 08 06:43:02 crc kubenswrapper[4717]: I0308 06:43:02.538615 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnrhm" event={"ID":"3b55fb65-fed7-4302-8832-5c8a71919d57","Type":"ContainerDied","Data":"c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d"} Mar 08 06:43:02 crc kubenswrapper[4717]: I0308 06:43:02.538674 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnrhm" event={"ID":"3b55fb65-fed7-4302-8832-5c8a71919d57","Type":"ContainerStarted","Data":"3de2ca0ca45d4f29edc15184f3e0f2e48b483874792c45ab71b8e92bc4fbcc02"} Mar 08 06:43:03 crc kubenswrapper[4717]: I0308 06:43:03.550257 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnrhm" event={"ID":"3b55fb65-fed7-4302-8832-5c8a71919d57","Type":"ContainerStarted","Data":"d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1"} Mar 08 06:43:04 crc kubenswrapper[4717]: I0308 06:43:04.581244 4717 generic.go:334] "Generic (PLEG): container finished" podID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerID="d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1" exitCode=0 Mar 08 06:43:04 crc kubenswrapper[4717]: I0308 06:43:04.581342 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnrhm" 
event={"ID":"3b55fb65-fed7-4302-8832-5c8a71919d57","Type":"ContainerDied","Data":"d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1"} Mar 08 06:43:05 crc kubenswrapper[4717]: I0308 06:43:05.597910 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnrhm" event={"ID":"3b55fb65-fed7-4302-8832-5c8a71919d57","Type":"ContainerStarted","Data":"c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef"} Mar 08 06:43:05 crc kubenswrapper[4717]: I0308 06:43:05.620723 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bnrhm" podStartSLOduration=2.172754974 podStartE2EDuration="4.62070239s" podCreationTimestamp="2026-03-08 06:43:01 +0000 UTC" firstStartedPulling="2026-03-08 06:43:02.540179999 +0000 UTC m=+4609.457828883" lastFinishedPulling="2026-03-08 06:43:04.988127455 +0000 UTC m=+4611.905776299" observedRunningTime="2026-03-08 06:43:05.619151842 +0000 UTC m=+4612.536800696" watchObservedRunningTime="2026-03-08 06:43:05.62070239 +0000 UTC m=+4612.538351234" Mar 08 06:43:11 crc kubenswrapper[4717]: I0308 06:43:11.651568 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:11 crc kubenswrapper[4717]: I0308 06:43:11.652112 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:11 crc kubenswrapper[4717]: I0308 06:43:11.716248 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:11 crc kubenswrapper[4717]: I0308 06:43:11.768056 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:11 crc kubenswrapper[4717]: I0308 06:43:11.956162 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bnrhm"] Mar 08 06:43:12 crc kubenswrapper[4717]: I0308 06:43:12.782963 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:43:12 crc kubenswrapper[4717]: E0308 06:43:12.783325 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:43:13 crc kubenswrapper[4717]: I0308 06:43:13.716409 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bnrhm" podUID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerName="registry-server" containerID="cri-o://c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef" gracePeriod=2 Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.159634 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.267139 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-utilities\") pod \"3b55fb65-fed7-4302-8832-5c8a71919d57\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.267231 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjw82\" (UniqueName: \"kubernetes.io/projected/3b55fb65-fed7-4302-8832-5c8a71919d57-kube-api-access-rjw82\") pod \"3b55fb65-fed7-4302-8832-5c8a71919d57\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.267261 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-catalog-content\") pod \"3b55fb65-fed7-4302-8832-5c8a71919d57\" (UID: \"3b55fb65-fed7-4302-8832-5c8a71919d57\") " Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.268174 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-utilities" (OuterVolumeSpecName: "utilities") pod "3b55fb65-fed7-4302-8832-5c8a71919d57" (UID: "3b55fb65-fed7-4302-8832-5c8a71919d57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.275610 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b55fb65-fed7-4302-8832-5c8a71919d57-kube-api-access-rjw82" (OuterVolumeSpecName: "kube-api-access-rjw82") pod "3b55fb65-fed7-4302-8832-5c8a71919d57" (UID: "3b55fb65-fed7-4302-8832-5c8a71919d57"). InnerVolumeSpecName "kube-api-access-rjw82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.294565 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b55fb65-fed7-4302-8832-5c8a71919d57" (UID: "3b55fb65-fed7-4302-8832-5c8a71919d57"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.370068 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.370102 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjw82\" (UniqueName: \"kubernetes.io/projected/3b55fb65-fed7-4302-8832-5c8a71919d57-kube-api-access-rjw82\") on node \"crc\" DevicePath \"\"" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.370118 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b55fb65-fed7-4302-8832-5c8a71919d57-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.730951 4717 generic.go:334] "Generic (PLEG): container finished" podID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerID="c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef" exitCode=0 Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.731040 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnrhm" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.731035 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnrhm" event={"ID":"3b55fb65-fed7-4302-8832-5c8a71919d57","Type":"ContainerDied","Data":"c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef"} Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.731745 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnrhm" event={"ID":"3b55fb65-fed7-4302-8832-5c8a71919d57","Type":"ContainerDied","Data":"3de2ca0ca45d4f29edc15184f3e0f2e48b483874792c45ab71b8e92bc4fbcc02"} Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.731792 4717 scope.go:117] "RemoveContainer" containerID="c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.775058 4717 scope.go:117] "RemoveContainer" containerID="d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.779194 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnrhm"] Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.789446 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnrhm"] Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.807461 4717 scope.go:117] "RemoveContainer" containerID="c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.882976 4717 scope.go:117] "RemoveContainer" containerID="c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef" Mar 08 06:43:14 crc kubenswrapper[4717]: E0308 06:43:14.883522 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef\": container with ID starting with c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef not found: ID does not exist" containerID="c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.883615 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef"} err="failed to get container status \"c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef\": rpc error: code = NotFound desc = could not find container \"c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef\": container with ID starting with c778ecf90b145af08a936de932f2f6ced1c08e3105d453e5ea1be7113452f8ef not found: ID does not exist" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.883721 4717 scope.go:117] "RemoveContainer" containerID="d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1" Mar 08 06:43:14 crc kubenswrapper[4717]: E0308 06:43:14.884144 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1\": container with ID starting with d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1 not found: ID does not exist" containerID="d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.884200 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1"} err="failed to get container status \"d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1\": rpc error: code = NotFound desc = could not find container \"d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1\": container with ID 
starting with d6848b51e7b6b3934a9682b0b6e1c16db16152963be54d1ac98b17e113b3ede1 not found: ID does not exist" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.884240 4717 scope.go:117] "RemoveContainer" containerID="c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d" Mar 08 06:43:14 crc kubenswrapper[4717]: E0308 06:43:14.887353 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d\": container with ID starting with c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d not found: ID does not exist" containerID="c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d" Mar 08 06:43:14 crc kubenswrapper[4717]: I0308 06:43:14.887402 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d"} err="failed to get container status \"c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d\": rpc error: code = NotFound desc = could not find container \"c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d\": container with ID starting with c0bc0f70a75df6b2f0c57d5d427f67650f1efda1aaa48422f1f2a5cf62207e7d not found: ID does not exist" Mar 08 06:43:15 crc kubenswrapper[4717]: I0308 06:43:15.800027 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b55fb65-fed7-4302-8832-5c8a71919d57" path="/var/lib/kubelet/pods/3b55fb65-fed7-4302-8832-5c8a71919d57/volumes" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.372577 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sv56l"] Mar 08 06:43:23 crc kubenswrapper[4717]: E0308 06:43:23.373666 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerName="registry-server" Mar 08 06:43:23 crc 
kubenswrapper[4717]: I0308 06:43:23.373699 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerName="registry-server" Mar 08 06:43:23 crc kubenswrapper[4717]: E0308 06:43:23.373725 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerName="extract-content" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.373733 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerName="extract-content" Mar 08 06:43:23 crc kubenswrapper[4717]: E0308 06:43:23.373752 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerName="extract-utilities" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.373760 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerName="extract-utilities" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.374019 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b55fb65-fed7-4302-8832-5c8a71919d57" containerName="registry-server" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.378390 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.387759 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sv56l"] Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.561631 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-utilities\") pod \"redhat-operators-sv56l\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.561809 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5skx\" (UniqueName: \"kubernetes.io/projected/e05d178e-e7e5-4662-bebe-8f83520da6f3-kube-api-access-d5skx\") pod \"redhat-operators-sv56l\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.562039 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-catalog-content\") pod \"redhat-operators-sv56l\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.664545 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-utilities\") pod \"redhat-operators-sv56l\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.664871 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d5skx\" (UniqueName: \"kubernetes.io/projected/e05d178e-e7e5-4662-bebe-8f83520da6f3-kube-api-access-d5skx\") pod \"redhat-operators-sv56l\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.665066 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-utilities\") pod \"redhat-operators-sv56l\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.665297 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-catalog-content\") pod \"redhat-operators-sv56l\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.665571 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-catalog-content\") pod \"redhat-operators-sv56l\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.685634 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5skx\" (UniqueName: \"kubernetes.io/projected/e05d178e-e7e5-4662-bebe-8f83520da6f3-kube-api-access-d5skx\") pod \"redhat-operators-sv56l\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:23 crc kubenswrapper[4717]: I0308 06:43:23.955050 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:24 crc kubenswrapper[4717]: I0308 06:43:24.589887 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sv56l"] Mar 08 06:43:25 crc kubenswrapper[4717]: I0308 06:43:25.079624 4717 generic.go:334] "Generic (PLEG): container finished" podID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerID="fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a" exitCode=0 Mar 08 06:43:25 crc kubenswrapper[4717]: I0308 06:43:25.079740 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv56l" event={"ID":"e05d178e-e7e5-4662-bebe-8f83520da6f3","Type":"ContainerDied","Data":"fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a"} Mar 08 06:43:25 crc kubenswrapper[4717]: I0308 06:43:25.079932 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv56l" event={"ID":"e05d178e-e7e5-4662-bebe-8f83520da6f3","Type":"ContainerStarted","Data":"d24158ad19bc7801fa88d7bbf4e1c8629212a1d15f593ad78ac0c028165f226e"} Mar 08 06:43:26 crc kubenswrapper[4717]: I0308 06:43:26.095680 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv56l" event={"ID":"e05d178e-e7e5-4662-bebe-8f83520da6f3","Type":"ContainerStarted","Data":"2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598"} Mar 08 06:43:27 crc kubenswrapper[4717]: I0308 06:43:27.782993 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:43:27 crc kubenswrapper[4717]: E0308 06:43:27.784042 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:43:32 crc kubenswrapper[4717]: I0308 06:43:32.153356 4717 generic.go:334] "Generic (PLEG): container finished" podID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerID="2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598" exitCode=0 Mar 08 06:43:32 crc kubenswrapper[4717]: I0308 06:43:32.153428 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv56l" event={"ID":"e05d178e-e7e5-4662-bebe-8f83520da6f3","Type":"ContainerDied","Data":"2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598"} Mar 08 06:43:33 crc kubenswrapper[4717]: I0308 06:43:33.169961 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv56l" event={"ID":"e05d178e-e7e5-4662-bebe-8f83520da6f3","Type":"ContainerStarted","Data":"ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca"} Mar 08 06:43:33 crc kubenswrapper[4717]: I0308 06:43:33.203216 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sv56l" podStartSLOduration=2.526937426 podStartE2EDuration="10.203192289s" podCreationTimestamp="2026-03-08 06:43:23 +0000 UTC" firstStartedPulling="2026-03-08 06:43:25.081866077 +0000 UTC m=+4631.999514921" lastFinishedPulling="2026-03-08 06:43:32.75812094 +0000 UTC m=+4639.675769784" observedRunningTime="2026-03-08 06:43:33.189491063 +0000 UTC m=+4640.107139937" watchObservedRunningTime="2026-03-08 06:43:33.203192289 +0000 UTC m=+4640.120841143" Mar 08 06:43:33 crc kubenswrapper[4717]: I0308 06:43:33.955990 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:33 crc kubenswrapper[4717]: I0308 
06:43:33.956026 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:35 crc kubenswrapper[4717]: I0308 06:43:35.031379 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sv56l" podUID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerName="registry-server" probeResult="failure" output=< Mar 08 06:43:35 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 06:43:35 crc kubenswrapper[4717]: > Mar 08 06:43:40 crc kubenswrapper[4717]: I0308 06:43:40.783283 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:43:40 crc kubenswrapper[4717]: E0308 06:43:40.785188 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:43:44 crc kubenswrapper[4717]: I0308 06:43:44.001594 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:44 crc kubenswrapper[4717]: I0308 06:43:44.424436 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:44 crc kubenswrapper[4717]: I0308 06:43:44.477603 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sv56l"] Mar 08 06:43:45 crc kubenswrapper[4717]: I0308 06:43:45.302288 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sv56l" 
podUID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerName="registry-server" containerID="cri-o://ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca" gracePeriod=2 Mar 08 06:43:45 crc kubenswrapper[4717]: I0308 06:43:45.820259 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:45 crc kubenswrapper[4717]: I0308 06:43:45.911066 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-catalog-content\") pod \"e05d178e-e7e5-4662-bebe-8f83520da6f3\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " Mar 08 06:43:45 crc kubenswrapper[4717]: I0308 06:43:45.911211 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5skx\" (UniqueName: \"kubernetes.io/projected/e05d178e-e7e5-4662-bebe-8f83520da6f3-kube-api-access-d5skx\") pod \"e05d178e-e7e5-4662-bebe-8f83520da6f3\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " Mar 08 06:43:45 crc kubenswrapper[4717]: I0308 06:43:45.911279 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-utilities\") pod \"e05d178e-e7e5-4662-bebe-8f83520da6f3\" (UID: \"e05d178e-e7e5-4662-bebe-8f83520da6f3\") " Mar 08 06:43:45 crc kubenswrapper[4717]: I0308 06:43:45.913014 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-utilities" (OuterVolumeSpecName: "utilities") pod "e05d178e-e7e5-4662-bebe-8f83520da6f3" (UID: "e05d178e-e7e5-4662-bebe-8f83520da6f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:43:45 crc kubenswrapper[4717]: I0308 06:43:45.921484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05d178e-e7e5-4662-bebe-8f83520da6f3-kube-api-access-d5skx" (OuterVolumeSpecName: "kube-api-access-d5skx") pod "e05d178e-e7e5-4662-bebe-8f83520da6f3" (UID: "e05d178e-e7e5-4662-bebe-8f83520da6f3"). InnerVolumeSpecName "kube-api-access-d5skx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.019458 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5skx\" (UniqueName: \"kubernetes.io/projected/e05d178e-e7e5-4662-bebe-8f83520da6f3-kube-api-access-d5skx\") on node \"crc\" DevicePath \"\"" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.019497 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.074923 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e05d178e-e7e5-4662-bebe-8f83520da6f3" (UID: "e05d178e-e7e5-4662-bebe-8f83520da6f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.121232 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e05d178e-e7e5-4662-bebe-8f83520da6f3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.318320 4717 generic.go:334] "Generic (PLEG): container finished" podID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerID="ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca" exitCode=0 Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.318371 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv56l" event={"ID":"e05d178e-e7e5-4662-bebe-8f83520da6f3","Type":"ContainerDied","Data":"ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca"} Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.318402 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sv56l" event={"ID":"e05d178e-e7e5-4662-bebe-8f83520da6f3","Type":"ContainerDied","Data":"d24158ad19bc7801fa88d7bbf4e1c8629212a1d15f593ad78ac0c028165f226e"} Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.318422 4717 scope.go:117] "RemoveContainer" containerID="ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.318533 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sv56l" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.354618 4717 scope.go:117] "RemoveContainer" containerID="2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.383871 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sv56l"] Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.392394 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sv56l"] Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.405935 4717 scope.go:117] "RemoveContainer" containerID="fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.434769 4717 scope.go:117] "RemoveContainer" containerID="ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca" Mar 08 06:43:46 crc kubenswrapper[4717]: E0308 06:43:46.435472 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca\": container with ID starting with ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca not found: ID does not exist" containerID="ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.435510 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca"} err="failed to get container status \"ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca\": rpc error: code = NotFound desc = could not find container \"ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca\": container with ID starting with ce4af8e2b7e2d2d18f3f160710e9b5df42c4f84ac8d177531427f1999df716ca not found: ID does 
not exist" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.435536 4717 scope.go:117] "RemoveContainer" containerID="2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598" Mar 08 06:43:46 crc kubenswrapper[4717]: E0308 06:43:46.435855 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598\": container with ID starting with 2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598 not found: ID does not exist" containerID="2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.435881 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598"} err="failed to get container status \"2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598\": rpc error: code = NotFound desc = could not find container \"2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598\": container with ID starting with 2718a8afaa94c1d180051c52aee72ed785174acbb307859761dede88f9bdc598 not found: ID does not exist" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.435898 4717 scope.go:117] "RemoveContainer" containerID="fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a" Mar 08 06:43:46 crc kubenswrapper[4717]: E0308 06:43:46.436473 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a\": container with ID starting with fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a not found: ID does not exist" containerID="fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a" Mar 08 06:43:46 crc kubenswrapper[4717]: I0308 06:43:46.436506 4717 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a"} err="failed to get container status \"fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a\": rpc error: code = NotFound desc = could not find container \"fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a\": container with ID starting with fd3de87b09e2391516fe9e038ec20147d31ebcb2fe28db16f06cde590d104e5a not found: ID does not exist" Mar 08 06:43:47 crc kubenswrapper[4717]: I0308 06:43:47.790839 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e05d178e-e7e5-4662-bebe-8f83520da6f3" path="/var/lib/kubelet/pods/e05d178e-e7e5-4662-bebe-8f83520da6f3/volumes" Mar 08 06:43:51 crc kubenswrapper[4717]: I0308 06:43:51.782610 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:43:51 crc kubenswrapper[4717]: E0308 06:43:51.785097 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.156124 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549204-kbtdp"] Mar 08 06:44:00 crc kubenswrapper[4717]: E0308 06:44:00.157475 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerName="extract-utilities" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.157498 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerName="extract-utilities" Mar 08 
06:44:00 crc kubenswrapper[4717]: E0308 06:44:00.157568 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerName="registry-server" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.157580 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerName="registry-server" Mar 08 06:44:00 crc kubenswrapper[4717]: E0308 06:44:00.157602 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerName="extract-content" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.157616 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerName="extract-content" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.158043 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05d178e-e7e5-4662-bebe-8f83520da6f3" containerName="registry-server" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.159284 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549204-kbtdp" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.162994 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.166416 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.166893 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.167530 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549204-kbtdp"] Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.207163 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4tmp\" (UniqueName: \"kubernetes.io/projected/bca54edc-7e3d-4971-9146-fd74a53edd2e-kube-api-access-b4tmp\") pod \"auto-csr-approver-29549204-kbtdp\" (UID: \"bca54edc-7e3d-4971-9146-fd74a53edd2e\") " pod="openshift-infra/auto-csr-approver-29549204-kbtdp" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.309057 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4tmp\" (UniqueName: \"kubernetes.io/projected/bca54edc-7e3d-4971-9146-fd74a53edd2e-kube-api-access-b4tmp\") pod \"auto-csr-approver-29549204-kbtdp\" (UID: \"bca54edc-7e3d-4971-9146-fd74a53edd2e\") " pod="openshift-infra/auto-csr-approver-29549204-kbtdp" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.331517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4tmp\" (UniqueName: \"kubernetes.io/projected/bca54edc-7e3d-4971-9146-fd74a53edd2e-kube-api-access-b4tmp\") pod \"auto-csr-approver-29549204-kbtdp\" (UID: \"bca54edc-7e3d-4971-9146-fd74a53edd2e\") " 
pod="openshift-infra/auto-csr-approver-29549204-kbtdp" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.480560 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549204-kbtdp" Mar 08 06:44:00 crc kubenswrapper[4717]: I0308 06:44:00.941372 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549204-kbtdp"] Mar 08 06:44:01 crc kubenswrapper[4717]: I0308 06:44:01.476790 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549204-kbtdp" event={"ID":"bca54edc-7e3d-4971-9146-fd74a53edd2e","Type":"ContainerStarted","Data":"9d47d4b9765a9be6b55707f08763535b8879a693817215584ee6c8c905897abf"} Mar 08 06:44:02 crc kubenswrapper[4717]: I0308 06:44:02.489456 4717 generic.go:334] "Generic (PLEG): container finished" podID="bca54edc-7e3d-4971-9146-fd74a53edd2e" containerID="6faf5b28bc81deaeea4c7701f6ec7abd56f3002c7ae6106616e2598406997bb0" exitCode=0 Mar 08 06:44:02 crc kubenswrapper[4717]: I0308 06:44:02.489515 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549204-kbtdp" event={"ID":"bca54edc-7e3d-4971-9146-fd74a53edd2e","Type":"ContainerDied","Data":"6faf5b28bc81deaeea4c7701f6ec7abd56f3002c7ae6106616e2598406997bb0"} Mar 08 06:44:03 crc kubenswrapper[4717]: I0308 06:44:03.887224 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549204-kbtdp" Mar 08 06:44:03 crc kubenswrapper[4717]: I0308 06:44:03.991185 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4tmp\" (UniqueName: \"kubernetes.io/projected/bca54edc-7e3d-4971-9146-fd74a53edd2e-kube-api-access-b4tmp\") pod \"bca54edc-7e3d-4971-9146-fd74a53edd2e\" (UID: \"bca54edc-7e3d-4971-9146-fd74a53edd2e\") " Mar 08 06:44:04 crc kubenswrapper[4717]: I0308 06:44:04.001996 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca54edc-7e3d-4971-9146-fd74a53edd2e-kube-api-access-b4tmp" (OuterVolumeSpecName: "kube-api-access-b4tmp") pod "bca54edc-7e3d-4971-9146-fd74a53edd2e" (UID: "bca54edc-7e3d-4971-9146-fd74a53edd2e"). InnerVolumeSpecName "kube-api-access-b4tmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:44:04 crc kubenswrapper[4717]: I0308 06:44:04.093503 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4tmp\" (UniqueName: \"kubernetes.io/projected/bca54edc-7e3d-4971-9146-fd74a53edd2e-kube-api-access-b4tmp\") on node \"crc\" DevicePath \"\"" Mar 08 06:44:04 crc kubenswrapper[4717]: I0308 06:44:04.519354 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549204-kbtdp" event={"ID":"bca54edc-7e3d-4971-9146-fd74a53edd2e","Type":"ContainerDied","Data":"9d47d4b9765a9be6b55707f08763535b8879a693817215584ee6c8c905897abf"} Mar 08 06:44:04 crc kubenswrapper[4717]: I0308 06:44:04.519408 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d47d4b9765a9be6b55707f08763535b8879a693817215584ee6c8c905897abf" Mar 08 06:44:04 crc kubenswrapper[4717]: I0308 06:44:04.519414 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549204-kbtdp" Mar 08 06:44:04 crc kubenswrapper[4717]: I0308 06:44:04.782535 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:44:04 crc kubenswrapper[4717]: I0308 06:44:04.978066 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549198-6xc2v"] Mar 08 06:44:04 crc kubenswrapper[4717]: I0308 06:44:04.989122 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549198-6xc2v"] Mar 08 06:44:05 crc kubenswrapper[4717]: I0308 06:44:05.791537 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60797c97-784d-4608-afe0-111159210b6a" path="/var/lib/kubelet/pods/60797c97-784d-4608-afe0-111159210b6a/volumes" Mar 08 06:44:06 crc kubenswrapper[4717]: I0308 06:44:06.541554 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"55cc7c8c0514abf1eb01c4c322712d8ae92f291c3cfe44c57de8c0a5bbe7e1f8"} Mar 08 06:44:39 crc kubenswrapper[4717]: I0308 06:44:39.595433 4717 scope.go:117] "RemoveContainer" containerID="481fdc3460af8622a4b1e1d8865a983573294da193424323bb4303efe5064c46" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.174187 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq"] Mar 08 06:45:00 crc kubenswrapper[4717]: E0308 06:45:00.177136 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca54edc-7e3d-4971-9146-fd74a53edd2e" containerName="oc" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.177385 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca54edc-7e3d-4971-9146-fd74a53edd2e" containerName="oc" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.177921 
4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca54edc-7e3d-4971-9146-fd74a53edd2e" containerName="oc" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.178899 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.182472 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.183033 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.197345 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq"] Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.272177 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/61b91d97-5f0f-4255-a8a4-6f35380e779e-secret-volume\") pod \"collect-profiles-29549205-dtxwq\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.272387 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61b91d97-5f0f-4255-a8a4-6f35380e779e-config-volume\") pod \"collect-profiles-29549205-dtxwq\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.272926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rrgc5\" (UniqueName: \"kubernetes.io/projected/61b91d97-5f0f-4255-a8a4-6f35380e779e-kube-api-access-rrgc5\") pod \"collect-profiles-29549205-dtxwq\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.375292 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61b91d97-5f0f-4255-a8a4-6f35380e779e-config-volume\") pod \"collect-profiles-29549205-dtxwq\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.375460 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrgc5\" (UniqueName: \"kubernetes.io/projected/61b91d97-5f0f-4255-a8a4-6f35380e779e-kube-api-access-rrgc5\") pod \"collect-profiles-29549205-dtxwq\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.375514 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/61b91d97-5f0f-4255-a8a4-6f35380e779e-secret-volume\") pod \"collect-profiles-29549205-dtxwq\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.376502 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61b91d97-5f0f-4255-a8a4-6f35380e779e-config-volume\") pod \"collect-profiles-29549205-dtxwq\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:00 crc 
kubenswrapper[4717]: I0308 06:45:00.383385 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/61b91d97-5f0f-4255-a8a4-6f35380e779e-secret-volume\") pod \"collect-profiles-29549205-dtxwq\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.392618 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrgc5\" (UniqueName: \"kubernetes.io/projected/61b91d97-5f0f-4255-a8a4-6f35380e779e-kube-api-access-rrgc5\") pod \"collect-profiles-29549205-dtxwq\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:00 crc kubenswrapper[4717]: I0308 06:45:00.505303 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:01 crc kubenswrapper[4717]: I0308 06:45:00.980011 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq"] Mar 08 06:45:01 crc kubenswrapper[4717]: I0308 06:45:01.163283 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" event={"ID":"61b91d97-5f0f-4255-a8a4-6f35380e779e","Type":"ContainerStarted","Data":"ae0736b1a7d778d94c33f43b9ebe1747162cc50cd0aa4adc915c95557a9ce661"} Mar 08 06:45:02 crc kubenswrapper[4717]: I0308 06:45:02.178945 4717 generic.go:334] "Generic (PLEG): container finished" podID="61b91d97-5f0f-4255-a8a4-6f35380e779e" containerID="24c818828f479dba86efbb3239c9acfad1204ddcc4d34ce13dd15fc9fbac8fb1" exitCode=0 Mar 08 06:45:02 crc kubenswrapper[4717]: I0308 06:45:02.179206 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" event={"ID":"61b91d97-5f0f-4255-a8a4-6f35380e779e","Type":"ContainerDied","Data":"24c818828f479dba86efbb3239c9acfad1204ddcc4d34ce13dd15fc9fbac8fb1"} Mar 08 06:45:03 crc kubenswrapper[4717]: I0308 06:45:03.595718 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:03 crc kubenswrapper[4717]: I0308 06:45:03.749640 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrgc5\" (UniqueName: \"kubernetes.io/projected/61b91d97-5f0f-4255-a8a4-6f35380e779e-kube-api-access-rrgc5\") pod \"61b91d97-5f0f-4255-a8a4-6f35380e779e\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " Mar 08 06:45:03 crc kubenswrapper[4717]: I0308 06:45:03.750214 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/61b91d97-5f0f-4255-a8a4-6f35380e779e-secret-volume\") pod \"61b91d97-5f0f-4255-a8a4-6f35380e779e\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " Mar 08 06:45:03 crc kubenswrapper[4717]: I0308 06:45:03.750335 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61b91d97-5f0f-4255-a8a4-6f35380e779e-config-volume\") pod \"61b91d97-5f0f-4255-a8a4-6f35380e779e\" (UID: \"61b91d97-5f0f-4255-a8a4-6f35380e779e\") " Mar 08 06:45:03 crc kubenswrapper[4717]: I0308 06:45:03.751396 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b91d97-5f0f-4255-a8a4-6f35380e779e-config-volume" (OuterVolumeSpecName: "config-volume") pod "61b91d97-5f0f-4255-a8a4-6f35380e779e" (UID: "61b91d97-5f0f-4255-a8a4-6f35380e779e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:45:03 crc kubenswrapper[4717]: I0308 06:45:03.761105 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b91d97-5f0f-4255-a8a4-6f35380e779e-kube-api-access-rrgc5" (OuterVolumeSpecName: "kube-api-access-rrgc5") pod "61b91d97-5f0f-4255-a8a4-6f35380e779e" (UID: "61b91d97-5f0f-4255-a8a4-6f35380e779e"). InnerVolumeSpecName "kube-api-access-rrgc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:45:03 crc kubenswrapper[4717]: I0308 06:45:03.761365 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b91d97-5f0f-4255-a8a4-6f35380e779e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "61b91d97-5f0f-4255-a8a4-6f35380e779e" (UID: "61b91d97-5f0f-4255-a8a4-6f35380e779e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:45:03 crc kubenswrapper[4717]: I0308 06:45:03.853831 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/61b91d97-5f0f-4255-a8a4-6f35380e779e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 06:45:03 crc kubenswrapper[4717]: I0308 06:45:03.853902 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/61b91d97-5f0f-4255-a8a4-6f35380e779e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 06:45:03 crc kubenswrapper[4717]: I0308 06:45:03.853930 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrgc5\" (UniqueName: \"kubernetes.io/projected/61b91d97-5f0f-4255-a8a4-6f35380e779e-kube-api-access-rrgc5\") on node \"crc\" DevicePath \"\"" Mar 08 06:45:04 crc kubenswrapper[4717]: I0308 06:45:04.202288 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" 
event={"ID":"61b91d97-5f0f-4255-a8a4-6f35380e779e","Type":"ContainerDied","Data":"ae0736b1a7d778d94c33f43b9ebe1747162cc50cd0aa4adc915c95557a9ce661"} Mar 08 06:45:04 crc kubenswrapper[4717]: I0308 06:45:04.202313 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549205-dtxwq" Mar 08 06:45:04 crc kubenswrapper[4717]: I0308 06:45:04.202323 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae0736b1a7d778d94c33f43b9ebe1747162cc50cd0aa4adc915c95557a9ce661" Mar 08 06:45:04 crc kubenswrapper[4717]: I0308 06:45:04.695223 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq"] Mar 08 06:45:04 crc kubenswrapper[4717]: I0308 06:45:04.711593 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549160-v2bfq"] Mar 08 06:45:05 crc kubenswrapper[4717]: I0308 06:45:05.795098 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82fed304-b5c5-403b-9152-5c683be53931" path="/var/lib/kubelet/pods/82fed304-b5c5-403b-9152-5c683be53931/volumes" Mar 08 06:45:39 crc kubenswrapper[4717]: I0308 06:45:39.726552 4717 scope.go:117] "RemoveContainer" containerID="1d4a5ddcbe68fd34fff714ecbd16e34fab97da7dc904b651d56c310e294e8199" Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.145593 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549206-bxbdf"] Mar 08 06:46:00 crc kubenswrapper[4717]: E0308 06:46:00.146921 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b91d97-5f0f-4255-a8a4-6f35380e779e" containerName="collect-profiles" Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.146944 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b91d97-5f0f-4255-a8a4-6f35380e779e" containerName="collect-profiles" Mar 08 06:46:00 crc 
kubenswrapper[4717]: I0308 06:46:00.147380 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b91d97-5f0f-4255-a8a4-6f35380e779e" containerName="collect-profiles" Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.148582 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549206-bxbdf" Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.152139 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.152728 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.152804 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.171790 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549206-bxbdf"] Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.287839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnsmx\" (UniqueName: \"kubernetes.io/projected/0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a-kube-api-access-mnsmx\") pod \"auto-csr-approver-29549206-bxbdf\" (UID: \"0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a\") " pod="openshift-infra/auto-csr-approver-29549206-bxbdf" Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.389372 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnsmx\" (UniqueName: \"kubernetes.io/projected/0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a-kube-api-access-mnsmx\") pod \"auto-csr-approver-29549206-bxbdf\" (UID: \"0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a\") " pod="openshift-infra/auto-csr-approver-29549206-bxbdf" Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.680038 
4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnsmx\" (UniqueName: \"kubernetes.io/projected/0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a-kube-api-access-mnsmx\") pod \"auto-csr-approver-29549206-bxbdf\" (UID: \"0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a\") " pod="openshift-infra/auto-csr-approver-29549206-bxbdf" Mar 08 06:46:00 crc kubenswrapper[4717]: I0308 06:46:00.788889 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549206-bxbdf" Mar 08 06:46:01 crc kubenswrapper[4717]: I0308 06:46:01.273028 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549206-bxbdf"] Mar 08 06:46:01 crc kubenswrapper[4717]: W0308 06:46:01.279607 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ea1dca6_bcce_4675_8b6f_ce2cc6a2022a.slice/crio-407363a0305e1fed0ec84bb88aaa43857016df637465a0308389faad8162220d WatchSource:0}: Error finding container 407363a0305e1fed0ec84bb88aaa43857016df637465a0308389faad8162220d: Status 404 returned error can't find the container with id 407363a0305e1fed0ec84bb88aaa43857016df637465a0308389faad8162220d Mar 08 06:46:01 crc kubenswrapper[4717]: I0308 06:46:01.283672 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 06:46:01 crc kubenswrapper[4717]: I0308 06:46:01.837924 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549206-bxbdf" event={"ID":"0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a","Type":"ContainerStarted","Data":"407363a0305e1fed0ec84bb88aaa43857016df637465a0308389faad8162220d"} Mar 08 06:46:02 crc kubenswrapper[4717]: I0308 06:46:02.849116 4717 generic.go:334] "Generic (PLEG): container finished" podID="0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a" containerID="ec3fee07b9874b023fb8d1b677d36893e4dfd3e5f150595da7e52911cd2343c9" exitCode=0 
Mar 08 06:46:02 crc kubenswrapper[4717]: I0308 06:46:02.849164 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549206-bxbdf" event={"ID":"0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a","Type":"ContainerDied","Data":"ec3fee07b9874b023fb8d1b677d36893e4dfd3e5f150595da7e52911cd2343c9"} Mar 08 06:46:04 crc kubenswrapper[4717]: I0308 06:46:04.291392 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549206-bxbdf" Mar 08 06:46:04 crc kubenswrapper[4717]: I0308 06:46:04.469009 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnsmx\" (UniqueName: \"kubernetes.io/projected/0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a-kube-api-access-mnsmx\") pod \"0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a\" (UID: \"0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a\") " Mar 08 06:46:04 crc kubenswrapper[4717]: I0308 06:46:04.476313 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a-kube-api-access-mnsmx" (OuterVolumeSpecName: "kube-api-access-mnsmx") pod "0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a" (UID: "0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a"). InnerVolumeSpecName "kube-api-access-mnsmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:46:04 crc kubenswrapper[4717]: I0308 06:46:04.571890 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnsmx\" (UniqueName: \"kubernetes.io/projected/0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a-kube-api-access-mnsmx\") on node \"crc\" DevicePath \"\"" Mar 08 06:46:04 crc kubenswrapper[4717]: I0308 06:46:04.879301 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549206-bxbdf" event={"ID":"0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a","Type":"ContainerDied","Data":"407363a0305e1fed0ec84bb88aaa43857016df637465a0308389faad8162220d"} Mar 08 06:46:04 crc kubenswrapper[4717]: I0308 06:46:04.880099 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="407363a0305e1fed0ec84bb88aaa43857016df637465a0308389faad8162220d" Mar 08 06:46:04 crc kubenswrapper[4717]: I0308 06:46:04.879380 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549206-bxbdf" Mar 08 06:46:05 crc kubenswrapper[4717]: I0308 06:46:05.418806 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549200-zph9l"] Mar 08 06:46:05 crc kubenswrapper[4717]: I0308 06:46:05.428761 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549200-zph9l"] Mar 08 06:46:05 crc kubenswrapper[4717]: I0308 06:46:05.800363 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21870681-96ec-4a38-8f09-a774da82e554" path="/var/lib/kubelet/pods/21870681-96ec-4a38-8f09-a774da82e554/volumes" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.641394 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zkks8"] Mar 08 06:46:22 crc kubenswrapper[4717]: E0308 06:46:22.642354 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a" containerName="oc" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.642369 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a" containerName="oc" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.642624 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a" containerName="oc" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.644348 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.661103 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zkks8"] Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.703711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9cpt\" (UniqueName: \"kubernetes.io/projected/c1c079da-3294-4152-90c8-1f2d7d6dd97b-kube-api-access-s9cpt\") pod \"community-operators-zkks8\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.703784 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-catalog-content\") pod \"community-operators-zkks8\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.704070 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-utilities\") pod \"community-operators-zkks8\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " 
pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.805479 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9cpt\" (UniqueName: \"kubernetes.io/projected/c1c079da-3294-4152-90c8-1f2d7d6dd97b-kube-api-access-s9cpt\") pod \"community-operators-zkks8\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.805556 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-catalog-content\") pod \"community-operators-zkks8\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.805644 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-utilities\") pod \"community-operators-zkks8\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.806119 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-utilities\") pod \"community-operators-zkks8\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:22 crc kubenswrapper[4717]: I0308 06:46:22.806132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-catalog-content\") pod \"community-operators-zkks8\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " 
pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:23 crc kubenswrapper[4717]: I0308 06:46:23.179368 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9cpt\" (UniqueName: \"kubernetes.io/projected/c1c079da-3294-4152-90c8-1f2d7d6dd97b-kube-api-access-s9cpt\") pod \"community-operators-zkks8\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:23 crc kubenswrapper[4717]: I0308 06:46:23.293104 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:23 crc kubenswrapper[4717]: I0308 06:46:23.855075 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zkks8"] Mar 08 06:46:23 crc kubenswrapper[4717]: W0308 06:46:23.859463 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c079da_3294_4152_90c8_1f2d7d6dd97b.slice/crio-169e0a6b43ae3a460d2003fd43d5225afe3c2d0811f033d95972a9f6b8d68356 WatchSource:0}: Error finding container 169e0a6b43ae3a460d2003fd43d5225afe3c2d0811f033d95972a9f6b8d68356: Status 404 returned error can't find the container with id 169e0a6b43ae3a460d2003fd43d5225afe3c2d0811f033d95972a9f6b8d68356 Mar 08 06:46:24 crc kubenswrapper[4717]: I0308 06:46:24.085002 4717 generic.go:334] "Generic (PLEG): container finished" podID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" containerID="db54aea2f3153fda240b7477f7ae4b36b77c452f605509f313f9f83a4708afc1" exitCode=0 Mar 08 06:46:24 crc kubenswrapper[4717]: I0308 06:46:24.085043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkks8" event={"ID":"c1c079da-3294-4152-90c8-1f2d7d6dd97b","Type":"ContainerDied","Data":"db54aea2f3153fda240b7477f7ae4b36b77c452f605509f313f9f83a4708afc1"} Mar 08 06:46:24 crc kubenswrapper[4717]: I0308 06:46:24.085071 
4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkks8" event={"ID":"c1c079da-3294-4152-90c8-1f2d7d6dd97b","Type":"ContainerStarted","Data":"169e0a6b43ae3a460d2003fd43d5225afe3c2d0811f033d95972a9f6b8d68356"} Mar 08 06:46:25 crc kubenswrapper[4717]: I0308 06:46:25.096776 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkks8" event={"ID":"c1c079da-3294-4152-90c8-1f2d7d6dd97b","Type":"ContainerStarted","Data":"c6e8ba3a8b3e173f8e92228da2d8ab5db5f521cb4a4e1eb02f4a3327bc8f8952"} Mar 08 06:46:26 crc kubenswrapper[4717]: I0308 06:46:26.114642 4717 generic.go:334] "Generic (PLEG): container finished" podID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" containerID="c6e8ba3a8b3e173f8e92228da2d8ab5db5f521cb4a4e1eb02f4a3327bc8f8952" exitCode=0 Mar 08 06:46:26 crc kubenswrapper[4717]: I0308 06:46:26.114735 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkks8" event={"ID":"c1c079da-3294-4152-90c8-1f2d7d6dd97b","Type":"ContainerDied","Data":"c6e8ba3a8b3e173f8e92228da2d8ab5db5f521cb4a4e1eb02f4a3327bc8f8952"} Mar 08 06:46:27 crc kubenswrapper[4717]: I0308 06:46:27.127366 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkks8" event={"ID":"c1c079da-3294-4152-90c8-1f2d7d6dd97b","Type":"ContainerStarted","Data":"37b12cf3fbde0e73d366bc3edc8c10f610163c95c51c83b4019e4c1a2eac0c66"} Mar 08 06:46:27 crc kubenswrapper[4717]: I0308 06:46:27.149706 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zkks8" podStartSLOduration=2.610328407 podStartE2EDuration="5.149657203s" podCreationTimestamp="2026-03-08 06:46:22 +0000 UTC" firstStartedPulling="2026-03-08 06:46:24.087703213 +0000 UTC m=+4811.005352057" lastFinishedPulling="2026-03-08 06:46:26.627032009 +0000 UTC m=+4813.544680853" observedRunningTime="2026-03-08 
06:46:27.148302529 +0000 UTC m=+4814.065951373" watchObservedRunningTime="2026-03-08 06:46:27.149657203 +0000 UTC m=+4814.067306057" Mar 08 06:46:33 crc kubenswrapper[4717]: I0308 06:46:33.293473 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:33 crc kubenswrapper[4717]: I0308 06:46:33.294208 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:33 crc kubenswrapper[4717]: I0308 06:46:33.379434 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:34 crc kubenswrapper[4717]: I0308 06:46:34.120382 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:46:34 crc kubenswrapper[4717]: I0308 06:46:34.120933 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:46:34 crc kubenswrapper[4717]: I0308 06:46:34.265796 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:34 crc kubenswrapper[4717]: I0308 06:46:34.326110 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zkks8"] Mar 08 06:46:36 crc kubenswrapper[4717]: I0308 06:46:36.234395 4717 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-zkks8" podUID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" containerName="registry-server" containerID="cri-o://37b12cf3fbde0e73d366bc3edc8c10f610163c95c51c83b4019e4c1a2eac0c66" gracePeriod=2 Mar 08 06:46:37 crc kubenswrapper[4717]: I0308 06:46:37.245931 4717 generic.go:334] "Generic (PLEG): container finished" podID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" containerID="37b12cf3fbde0e73d366bc3edc8c10f610163c95c51c83b4019e4c1a2eac0c66" exitCode=0 Mar 08 06:46:37 crc kubenswrapper[4717]: I0308 06:46:37.246117 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkks8" event={"ID":"c1c079da-3294-4152-90c8-1f2d7d6dd97b","Type":"ContainerDied","Data":"37b12cf3fbde0e73d366bc3edc8c10f610163c95c51c83b4019e4c1a2eac0c66"} Mar 08 06:46:37 crc kubenswrapper[4717]: I0308 06:46:37.887455 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:37 crc kubenswrapper[4717]: I0308 06:46:37.943702 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-catalog-content\") pod \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " Mar 08 06:46:37 crc kubenswrapper[4717]: I0308 06:46:37.943965 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-utilities\") pod \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " Mar 08 06:46:37 crc kubenswrapper[4717]: I0308 06:46:37.944029 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9cpt\" (UniqueName: \"kubernetes.io/projected/c1c079da-3294-4152-90c8-1f2d7d6dd97b-kube-api-access-s9cpt\") pod 
\"c1c079da-3294-4152-90c8-1f2d7d6dd97b\" (UID: \"c1c079da-3294-4152-90c8-1f2d7d6dd97b\") " Mar 08 06:46:37 crc kubenswrapper[4717]: I0308 06:46:37.946543 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-utilities" (OuterVolumeSpecName: "utilities") pod "c1c079da-3294-4152-90c8-1f2d7d6dd97b" (UID: "c1c079da-3294-4152-90c8-1f2d7d6dd97b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:46:37 crc kubenswrapper[4717]: I0308 06:46:37.962762 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c079da-3294-4152-90c8-1f2d7d6dd97b-kube-api-access-s9cpt" (OuterVolumeSpecName: "kube-api-access-s9cpt") pod "c1c079da-3294-4152-90c8-1f2d7d6dd97b" (UID: "c1c079da-3294-4152-90c8-1f2d7d6dd97b"). InnerVolumeSpecName "kube-api-access-s9cpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.020857 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1c079da-3294-4152-90c8-1f2d7d6dd97b" (UID: "c1c079da-3294-4152-90c8-1f2d7d6dd97b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.046647 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9cpt\" (UniqueName: \"kubernetes.io/projected/c1c079da-3294-4152-90c8-1f2d7d6dd97b-kube-api-access-s9cpt\") on node \"crc\" DevicePath \"\"" Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.046675 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.046714 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1c079da-3294-4152-90c8-1f2d7d6dd97b-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.259275 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkks8" event={"ID":"c1c079da-3294-4152-90c8-1f2d7d6dd97b","Type":"ContainerDied","Data":"169e0a6b43ae3a460d2003fd43d5225afe3c2d0811f033d95972a9f6b8d68356"} Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.259333 4717 scope.go:117] "RemoveContainer" containerID="37b12cf3fbde0e73d366bc3edc8c10f610163c95c51c83b4019e4c1a2eac0c66" Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.259485 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zkks8" Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.289172 4717 scope.go:117] "RemoveContainer" containerID="c6e8ba3a8b3e173f8e92228da2d8ab5db5f521cb4a4e1eb02f4a3327bc8f8952" Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.318829 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zkks8"] Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.324904 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zkks8"] Mar 08 06:46:38 crc kubenswrapper[4717]: I0308 06:46:38.341070 4717 scope.go:117] "RemoveContainer" containerID="db54aea2f3153fda240b7477f7ae4b36b77c452f605509f313f9f83a4708afc1" Mar 08 06:46:39 crc kubenswrapper[4717]: I0308 06:46:39.797597 4717 scope.go:117] "RemoveContainer" containerID="67464bcc1413496b67a7c4deafa90eb6a6125f6f4703ee0099db97108ca8fa1a" Mar 08 06:46:39 crc kubenswrapper[4717]: I0308 06:46:39.800547 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" path="/var/lib/kubelet/pods/c1c079da-3294-4152-90c8-1f2d7d6dd97b/volumes" Mar 08 06:47:04 crc kubenswrapper[4717]: I0308 06:47:04.120260 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:47:04 crc kubenswrapper[4717]: I0308 06:47:04.120920 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:47:34 crc kubenswrapper[4717]: 
I0308 06:47:34.120177 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:47:34 crc kubenswrapper[4717]: I0308 06:47:34.120893 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:47:34 crc kubenswrapper[4717]: I0308 06:47:34.120953 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 06:47:34 crc kubenswrapper[4717]: I0308 06:47:34.121913 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55cc7c8c0514abf1eb01c4c322712d8ae92f291c3cfe44c57de8c0a5bbe7e1f8"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 06:47:34 crc kubenswrapper[4717]: I0308 06:47:34.121994 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://55cc7c8c0514abf1eb01c4c322712d8ae92f291c3cfe44c57de8c0a5bbe7e1f8" gracePeriod=600 Mar 08 06:47:34 crc kubenswrapper[4717]: I0308 06:47:34.983496 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="55cc7c8c0514abf1eb01c4c322712d8ae92f291c3cfe44c57de8c0a5bbe7e1f8" exitCode=0 Mar 
08 06:47:34 crc kubenswrapper[4717]: I0308 06:47:34.983542 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"55cc7c8c0514abf1eb01c4c322712d8ae92f291c3cfe44c57de8c0a5bbe7e1f8"} Mar 08 06:47:34 crc kubenswrapper[4717]: I0308 06:47:34.984167 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2"} Mar 08 06:47:34 crc kubenswrapper[4717]: I0308 06:47:34.984186 4717 scope.go:117] "RemoveContainer" containerID="113b1413e3fc9e2f8c394fc0546139006e890d0363a3cd890fdf32808206c9ce" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.159902 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549208-7zc26"] Mar 08 06:48:00 crc kubenswrapper[4717]: E0308 06:48:00.160870 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" containerName="registry-server" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.160886 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" containerName="registry-server" Mar 08 06:48:00 crc kubenswrapper[4717]: E0308 06:48:00.160912 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" containerName="extract-content" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.160920 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" containerName="extract-content" Mar 08 06:48:00 crc kubenswrapper[4717]: E0308 06:48:00.160949 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" 
containerName="extract-utilities" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.160957 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" containerName="extract-utilities" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.161202 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c079da-3294-4152-90c8-1f2d7d6dd97b" containerName="registry-server" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.162224 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549208-7zc26" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.166233 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.166342 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.166440 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.175101 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549208-7zc26"] Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.238293 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7cnj\" (UniqueName: \"kubernetes.io/projected/e17c651a-fec7-4506-af80-32caddc1fb76-kube-api-access-m7cnj\") pod \"auto-csr-approver-29549208-7zc26\" (UID: \"e17c651a-fec7-4506-af80-32caddc1fb76\") " pod="openshift-infra/auto-csr-approver-29549208-7zc26" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.341270 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7cnj\" (UniqueName: 
\"kubernetes.io/projected/e17c651a-fec7-4506-af80-32caddc1fb76-kube-api-access-m7cnj\") pod \"auto-csr-approver-29549208-7zc26\" (UID: \"e17c651a-fec7-4506-af80-32caddc1fb76\") " pod="openshift-infra/auto-csr-approver-29549208-7zc26" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.361089 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7cnj\" (UniqueName: \"kubernetes.io/projected/e17c651a-fec7-4506-af80-32caddc1fb76-kube-api-access-m7cnj\") pod \"auto-csr-approver-29549208-7zc26\" (UID: \"e17c651a-fec7-4506-af80-32caddc1fb76\") " pod="openshift-infra/auto-csr-approver-29549208-7zc26" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.483184 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549208-7zc26" Mar 08 06:48:00 crc kubenswrapper[4717]: I0308 06:48:00.928608 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549208-7zc26"] Mar 08 06:48:01 crc kubenswrapper[4717]: I0308 06:48:01.233593 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549208-7zc26" event={"ID":"e17c651a-fec7-4506-af80-32caddc1fb76","Type":"ContainerStarted","Data":"7d28819d31677e783b9c6470640db823d1ef284e66a664f88340dd9c9084f6d0"} Mar 08 06:48:03 crc kubenswrapper[4717]: I0308 06:48:03.255389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549208-7zc26" event={"ID":"e17c651a-fec7-4506-af80-32caddc1fb76","Type":"ContainerStarted","Data":"46c0b0f0d18c747ad4d4aca672d8b289cb8ae0c150c1b54aeba9fd3f44f43f12"} Mar 08 06:48:03 crc kubenswrapper[4717]: I0308 06:48:03.272186 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549208-7zc26" podStartSLOduration=1.382136981 podStartE2EDuration="3.272169262s" podCreationTimestamp="2026-03-08 06:48:00 +0000 UTC" 
firstStartedPulling="2026-03-08 06:48:00.931928505 +0000 UTC m=+4907.849577349" lastFinishedPulling="2026-03-08 06:48:02.821960786 +0000 UTC m=+4909.739609630" observedRunningTime="2026-03-08 06:48:03.266970564 +0000 UTC m=+4910.184619418" watchObservedRunningTime="2026-03-08 06:48:03.272169262 +0000 UTC m=+4910.189818106" Mar 08 06:48:04 crc kubenswrapper[4717]: I0308 06:48:04.266158 4717 generic.go:334] "Generic (PLEG): container finished" podID="e17c651a-fec7-4506-af80-32caddc1fb76" containerID="46c0b0f0d18c747ad4d4aca672d8b289cb8ae0c150c1b54aeba9fd3f44f43f12" exitCode=0 Mar 08 06:48:04 crc kubenswrapper[4717]: I0308 06:48:04.266270 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549208-7zc26" event={"ID":"e17c651a-fec7-4506-af80-32caddc1fb76","Type":"ContainerDied","Data":"46c0b0f0d18c747ad4d4aca672d8b289cb8ae0c150c1b54aeba9fd3f44f43f12"} Mar 08 06:48:05 crc kubenswrapper[4717]: I0308 06:48:05.759910 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549208-7zc26" Mar 08 06:48:05 crc kubenswrapper[4717]: I0308 06:48:05.958018 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7cnj\" (UniqueName: \"kubernetes.io/projected/e17c651a-fec7-4506-af80-32caddc1fb76-kube-api-access-m7cnj\") pod \"e17c651a-fec7-4506-af80-32caddc1fb76\" (UID: \"e17c651a-fec7-4506-af80-32caddc1fb76\") " Mar 08 06:48:05 crc kubenswrapper[4717]: I0308 06:48:05.965343 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17c651a-fec7-4506-af80-32caddc1fb76-kube-api-access-m7cnj" (OuterVolumeSpecName: "kube-api-access-m7cnj") pod "e17c651a-fec7-4506-af80-32caddc1fb76" (UID: "e17c651a-fec7-4506-af80-32caddc1fb76"). InnerVolumeSpecName "kube-api-access-m7cnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:48:06 crc kubenswrapper[4717]: I0308 06:48:06.061134 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7cnj\" (UniqueName: \"kubernetes.io/projected/e17c651a-fec7-4506-af80-32caddc1fb76-kube-api-access-m7cnj\") on node \"crc\" DevicePath \"\"" Mar 08 06:48:06 crc kubenswrapper[4717]: I0308 06:48:06.343562 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549208-7zc26" event={"ID":"e17c651a-fec7-4506-af80-32caddc1fb76","Type":"ContainerDied","Data":"7d28819d31677e783b9c6470640db823d1ef284e66a664f88340dd9c9084f6d0"} Mar 08 06:48:06 crc kubenswrapper[4717]: I0308 06:48:06.343926 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d28819d31677e783b9c6470640db823d1ef284e66a664f88340dd9c9084f6d0" Mar 08 06:48:06 crc kubenswrapper[4717]: I0308 06:48:06.343755 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549208-7zc26" Mar 08 06:48:06 crc kubenswrapper[4717]: I0308 06:48:06.356401 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549202-wcbhf"] Mar 08 06:48:06 crc kubenswrapper[4717]: I0308 06:48:06.366737 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549202-wcbhf"] Mar 08 06:48:07 crc kubenswrapper[4717]: I0308 06:48:07.796768 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683" path="/var/lib/kubelet/pods/d0eb71a1-b6ce-4ad6-b2d3-99d1b66b6683/volumes" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.045613 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mwdvf"] Mar 08 06:48:20 crc kubenswrapper[4717]: E0308 06:48:20.046974 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e17c651a-fec7-4506-af80-32caddc1fb76" containerName="oc" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.046990 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17c651a-fec7-4506-af80-32caddc1fb76" containerName="oc" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.047259 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17c651a-fec7-4506-af80-32caddc1fb76" containerName="oc" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.048735 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.063022 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwdvf"] Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.153057 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-utilities\") pod \"certified-operators-mwdvf\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.153287 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7dfg\" (UniqueName: \"kubernetes.io/projected/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-kube-api-access-q7dfg\") pod \"certified-operators-mwdvf\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.153832 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-catalog-content\") pod \"certified-operators-mwdvf\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " 
pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.255543 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-catalog-content\") pod \"certified-operators-mwdvf\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.255647 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-utilities\") pod \"certified-operators-mwdvf\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.255739 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7dfg\" (UniqueName: \"kubernetes.io/projected/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-kube-api-access-q7dfg\") pod \"certified-operators-mwdvf\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.256256 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-catalog-content\") pod \"certified-operators-mwdvf\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.256370 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-utilities\") pod \"certified-operators-mwdvf\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " 
pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.275917 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7dfg\" (UniqueName: \"kubernetes.io/projected/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-kube-api-access-q7dfg\") pod \"certified-operators-mwdvf\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:20 crc kubenswrapper[4717]: I0308 06:48:20.395245 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:21 crc kubenswrapper[4717]: I0308 06:48:21.006614 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwdvf"] Mar 08 06:48:21 crc kubenswrapper[4717]: I0308 06:48:21.493266 4717 generic.go:334] "Generic (PLEG): container finished" podID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerID="1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2" exitCode=0 Mar 08 06:48:21 crc kubenswrapper[4717]: I0308 06:48:21.493318 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwdvf" event={"ID":"3387ec82-ef5b-4fa8-bb69-60000a9ac72b","Type":"ContainerDied","Data":"1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2"} Mar 08 06:48:21 crc kubenswrapper[4717]: I0308 06:48:21.493550 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwdvf" event={"ID":"3387ec82-ef5b-4fa8-bb69-60000a9ac72b","Type":"ContainerStarted","Data":"ea6441317d5627afb9e88a7e4d0214bb82843dd7b4386d648f501a1c935e3b19"} Mar 08 06:48:23 crc kubenswrapper[4717]: I0308 06:48:23.514429 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwdvf" 
event={"ID":"3387ec82-ef5b-4fa8-bb69-60000a9ac72b","Type":"ContainerStarted","Data":"ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1"} Mar 08 06:48:24 crc kubenswrapper[4717]: I0308 06:48:24.527113 4717 generic.go:334] "Generic (PLEG): container finished" podID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerID="ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1" exitCode=0 Mar 08 06:48:24 crc kubenswrapper[4717]: I0308 06:48:24.527200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwdvf" event={"ID":"3387ec82-ef5b-4fa8-bb69-60000a9ac72b","Type":"ContainerDied","Data":"ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1"} Mar 08 06:48:25 crc kubenswrapper[4717]: I0308 06:48:25.537374 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwdvf" event={"ID":"3387ec82-ef5b-4fa8-bb69-60000a9ac72b","Type":"ContainerStarted","Data":"8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8"} Mar 08 06:48:25 crc kubenswrapper[4717]: I0308 06:48:25.556125 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mwdvf" podStartSLOduration=1.917029578 podStartE2EDuration="5.55611133s" podCreationTimestamp="2026-03-08 06:48:20 +0000 UTC" firstStartedPulling="2026-03-08 06:48:21.495077096 +0000 UTC m=+4928.412725930" lastFinishedPulling="2026-03-08 06:48:25.134158838 +0000 UTC m=+4932.051807682" observedRunningTime="2026-03-08 06:48:25.552919322 +0000 UTC m=+4932.470568166" watchObservedRunningTime="2026-03-08 06:48:25.55611133 +0000 UTC m=+4932.473760164" Mar 08 06:48:30 crc kubenswrapper[4717]: I0308 06:48:30.395803 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:30 crc kubenswrapper[4717]: I0308 06:48:30.396487 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:30 crc kubenswrapper[4717]: I0308 06:48:30.466869 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:30 crc kubenswrapper[4717]: I0308 06:48:30.639264 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:30 crc kubenswrapper[4717]: I0308 06:48:30.703117 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwdvf"] Mar 08 06:48:32 crc kubenswrapper[4717]: I0308 06:48:32.608537 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mwdvf" podUID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerName="registry-server" containerID="cri-o://8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8" gracePeriod=2 Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.109099 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.187823 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-catalog-content\") pod \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.188081 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7dfg\" (UniqueName: \"kubernetes.io/projected/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-kube-api-access-q7dfg\") pod \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.188111 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-utilities\") pod \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\" (UID: \"3387ec82-ef5b-4fa8-bb69-60000a9ac72b\") " Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.191759 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-utilities" (OuterVolumeSpecName: "utilities") pod "3387ec82-ef5b-4fa8-bb69-60000a9ac72b" (UID: "3387ec82-ef5b-4fa8-bb69-60000a9ac72b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.212627 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-kube-api-access-q7dfg" (OuterVolumeSpecName: "kube-api-access-q7dfg") pod "3387ec82-ef5b-4fa8-bb69-60000a9ac72b" (UID: "3387ec82-ef5b-4fa8-bb69-60000a9ac72b"). InnerVolumeSpecName "kube-api-access-q7dfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.272346 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3387ec82-ef5b-4fa8-bb69-60000a9ac72b" (UID: "3387ec82-ef5b-4fa8-bb69-60000a9ac72b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.294003 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7dfg\" (UniqueName: \"kubernetes.io/projected/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-kube-api-access-q7dfg\") on node \"crc\" DevicePath \"\"" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.294046 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.294062 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3387ec82-ef5b-4fa8-bb69-60000a9ac72b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.618624 4717 generic.go:334] "Generic (PLEG): container finished" podID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerID="8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8" exitCode=0 Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.618821 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwdvf" event={"ID":"3387ec82-ef5b-4fa8-bb69-60000a9ac72b","Type":"ContainerDied","Data":"8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8"} Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.619050 4717 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-mwdvf" event={"ID":"3387ec82-ef5b-4fa8-bb69-60000a9ac72b","Type":"ContainerDied","Data":"ea6441317d5627afb9e88a7e4d0214bb82843dd7b4386d648f501a1c935e3b19"} Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.619078 4717 scope.go:117] "RemoveContainer" containerID="8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.618932 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwdvf" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.644101 4717 scope.go:117] "RemoveContainer" containerID="ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.676816 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwdvf"] Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.687074 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mwdvf"] Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.688911 4717 scope.go:117] "RemoveContainer" containerID="1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.735842 4717 scope.go:117] "RemoveContainer" containerID="8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8" Mar 08 06:48:33 crc kubenswrapper[4717]: E0308 06:48:33.736275 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8\": container with ID starting with 8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8 not found: ID does not exist" containerID="8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 
06:48:33.736333 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8"} err="failed to get container status \"8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8\": rpc error: code = NotFound desc = could not find container \"8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8\": container with ID starting with 8142d5e4c9af360f7b4a2ac54d8d483e798defa2f2cf1de824c3292ae266e9f8 not found: ID does not exist" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.736365 4717 scope.go:117] "RemoveContainer" containerID="ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1" Mar 08 06:48:33 crc kubenswrapper[4717]: E0308 06:48:33.736853 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1\": container with ID starting with ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1 not found: ID does not exist" containerID="ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.736901 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1"} err="failed to get container status \"ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1\": rpc error: code = NotFound desc = could not find container \"ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1\": container with ID starting with ed8f4fca2a1323f06908366ef5df74f19ab89929650300e6365a5265fe4f83f1 not found: ID does not exist" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.736935 4717 scope.go:117] "RemoveContainer" containerID="1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2" Mar 08 06:48:33 crc 
kubenswrapper[4717]: E0308 06:48:33.737268 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2\": container with ID starting with 1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2 not found: ID does not exist" containerID="1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.737335 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2"} err="failed to get container status \"1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2\": rpc error: code = NotFound desc = could not find container \"1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2\": container with ID starting with 1767e68d8416c45c19899a7ea7a03a15a346bdcd12a3ed5b061ace581723a7a2 not found: ID does not exist" Mar 08 06:48:33 crc kubenswrapper[4717]: I0308 06:48:33.791906 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" path="/var/lib/kubelet/pods/3387ec82-ef5b-4fa8-bb69-60000a9ac72b/volumes" Mar 08 06:48:39 crc kubenswrapper[4717]: I0308 06:48:39.941943 4717 scope.go:117] "RemoveContainer" containerID="76cd506abf9d822f053d10014620a16c9d63f125d5157645a40efa080fa682b9" Mar 08 06:48:51 crc kubenswrapper[4717]: E0308 06:48:51.804924 4717 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.44:38874->38.102.83.44:42899: read tcp 38.102.83.44:38874->38.102.83.44:42899: read: connection reset by peer Mar 08 06:49:34 crc kubenswrapper[4717]: I0308 06:49:34.120011 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:49:34 crc kubenswrapper[4717]: I0308 06:49:34.120573 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.165767 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549210-pgr86"] Mar 08 06:50:00 crc kubenswrapper[4717]: E0308 06:50:00.167161 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerName="extract-utilities" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.167184 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerName="extract-utilities" Mar 08 06:50:00 crc kubenswrapper[4717]: E0308 06:50:00.167224 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerName="extract-content" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.167237 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerName="extract-content" Mar 08 06:50:00 crc kubenswrapper[4717]: E0308 06:50:00.167258 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerName="registry-server" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.167294 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerName="registry-server" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.167668 4717 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3387ec82-ef5b-4fa8-bb69-60000a9ac72b" containerName="registry-server" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.168786 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549210-pgr86" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.172248 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.172484 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.172648 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.182301 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549210-pgr86"] Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.229433 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hkvd\" (UniqueName: \"kubernetes.io/projected/957225ec-c79c-4068-ad77-76c63b57d468-kube-api-access-2hkvd\") pod \"auto-csr-approver-29549210-pgr86\" (UID: \"957225ec-c79c-4068-ad77-76c63b57d468\") " pod="openshift-infra/auto-csr-approver-29549210-pgr86" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.331547 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hkvd\" (UniqueName: \"kubernetes.io/projected/957225ec-c79c-4068-ad77-76c63b57d468-kube-api-access-2hkvd\") pod \"auto-csr-approver-29549210-pgr86\" (UID: \"957225ec-c79c-4068-ad77-76c63b57d468\") " pod="openshift-infra/auto-csr-approver-29549210-pgr86" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.356065 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hkvd\" 
(UniqueName: \"kubernetes.io/projected/957225ec-c79c-4068-ad77-76c63b57d468-kube-api-access-2hkvd\") pod \"auto-csr-approver-29549210-pgr86\" (UID: \"957225ec-c79c-4068-ad77-76c63b57d468\") " pod="openshift-infra/auto-csr-approver-29549210-pgr86" Mar 08 06:50:00 crc kubenswrapper[4717]: I0308 06:50:00.524871 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549210-pgr86" Mar 08 06:50:01 crc kubenswrapper[4717]: I0308 06:50:01.047895 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549210-pgr86"] Mar 08 06:50:01 crc kubenswrapper[4717]: I0308 06:50:01.614675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549210-pgr86" event={"ID":"957225ec-c79c-4068-ad77-76c63b57d468","Type":"ContainerStarted","Data":"9e6521a74582ba69a4224a561b3d2f6b26bf0af8ce5a44937cec4880cf333a9b"} Mar 08 06:50:02 crc kubenswrapper[4717]: I0308 06:50:02.626892 4717 generic.go:334] "Generic (PLEG): container finished" podID="957225ec-c79c-4068-ad77-76c63b57d468" containerID="c5b4c2a9081f8c4abdf30cc565b88042590932bef05d4198d7e29250d9b40ccf" exitCode=0 Mar 08 06:50:02 crc kubenswrapper[4717]: I0308 06:50:02.626965 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549210-pgr86" event={"ID":"957225ec-c79c-4068-ad77-76c63b57d468","Type":"ContainerDied","Data":"c5b4c2a9081f8c4abdf30cc565b88042590932bef05d4198d7e29250d9b40ccf"} Mar 08 06:50:04 crc kubenswrapper[4717]: I0308 06:50:04.119996 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:50:04 crc kubenswrapper[4717]: I0308 06:50:04.120542 4717 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:50:04 crc kubenswrapper[4717]: I0308 06:50:04.125072 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549210-pgr86" Mar 08 06:50:04 crc kubenswrapper[4717]: I0308 06:50:04.210989 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hkvd\" (UniqueName: \"kubernetes.io/projected/957225ec-c79c-4068-ad77-76c63b57d468-kube-api-access-2hkvd\") pod \"957225ec-c79c-4068-ad77-76c63b57d468\" (UID: \"957225ec-c79c-4068-ad77-76c63b57d468\") " Mar 08 06:50:04 crc kubenswrapper[4717]: I0308 06:50:04.216083 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957225ec-c79c-4068-ad77-76c63b57d468-kube-api-access-2hkvd" (OuterVolumeSpecName: "kube-api-access-2hkvd") pod "957225ec-c79c-4068-ad77-76c63b57d468" (UID: "957225ec-c79c-4068-ad77-76c63b57d468"). InnerVolumeSpecName "kube-api-access-2hkvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:50:04 crc kubenswrapper[4717]: I0308 06:50:04.312879 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hkvd\" (UniqueName: \"kubernetes.io/projected/957225ec-c79c-4068-ad77-76c63b57d468-kube-api-access-2hkvd\") on node \"crc\" DevicePath \"\"" Mar 08 06:50:04 crc kubenswrapper[4717]: I0308 06:50:04.650919 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549210-pgr86" event={"ID":"957225ec-c79c-4068-ad77-76c63b57d468","Type":"ContainerDied","Data":"9e6521a74582ba69a4224a561b3d2f6b26bf0af8ce5a44937cec4880cf333a9b"} Mar 08 06:50:04 crc kubenswrapper[4717]: I0308 06:50:04.651281 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e6521a74582ba69a4224a561b3d2f6b26bf0af8ce5a44937cec4880cf333a9b" Mar 08 06:50:04 crc kubenswrapper[4717]: I0308 06:50:04.651035 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549210-pgr86" Mar 08 06:50:05 crc kubenswrapper[4717]: I0308 06:50:05.207723 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549204-kbtdp"] Mar 08 06:50:05 crc kubenswrapper[4717]: I0308 06:50:05.219081 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549204-kbtdp"] Mar 08 06:50:05 crc kubenswrapper[4717]: I0308 06:50:05.805434 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca54edc-7e3d-4971-9146-fd74a53edd2e" path="/var/lib/kubelet/pods/bca54edc-7e3d-4971-9146-fd74a53edd2e/volumes" Mar 08 06:50:34 crc kubenswrapper[4717]: I0308 06:50:34.120207 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 08 06:50:34 crc kubenswrapper[4717]: I0308 06:50:34.120775 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:50:34 crc kubenswrapper[4717]: I0308 06:50:34.120834 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 06:50:34 crc kubenswrapper[4717]: I0308 06:50:34.121530 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 06:50:34 crc kubenswrapper[4717]: I0308 06:50:34.121582 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" gracePeriod=600 Mar 08 06:50:34 crc kubenswrapper[4717]: E0308 06:50:34.261079 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:50:35 crc kubenswrapper[4717]: 
I0308 06:50:35.003938 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" exitCode=0 Mar 08 06:50:35 crc kubenswrapper[4717]: I0308 06:50:35.004016 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2"} Mar 08 06:50:35 crc kubenswrapper[4717]: I0308 06:50:35.004781 4717 scope.go:117] "RemoveContainer" containerID="55cc7c8c0514abf1eb01c4c322712d8ae92f291c3cfe44c57de8c0a5bbe7e1f8" Mar 08 06:50:35 crc kubenswrapper[4717]: I0308 06:50:35.005801 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:50:35 crc kubenswrapper[4717]: E0308 06:50:35.006269 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:50:40 crc kubenswrapper[4717]: I0308 06:50:40.201531 4717 scope.go:117] "RemoveContainer" containerID="6faf5b28bc81deaeea4c7701f6ec7abd56f3002c7ae6106616e2598406997bb0" Mar 08 06:50:45 crc kubenswrapper[4717]: I0308 06:50:45.782487 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:50:45 crc kubenswrapper[4717]: E0308 06:50:45.783836 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:50:59 crc kubenswrapper[4717]: I0308 06:50:59.782647 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:50:59 crc kubenswrapper[4717]: E0308 06:50:59.783807 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:51:13 crc kubenswrapper[4717]: I0308 06:51:13.789989 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:51:13 crc kubenswrapper[4717]: E0308 06:51:13.790620 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:51:25 crc kubenswrapper[4717]: I0308 06:51:25.782910 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:51:25 crc kubenswrapper[4717]: E0308 06:51:25.784252 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:51:40 crc kubenswrapper[4717]: I0308 06:51:40.782922 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:51:40 crc kubenswrapper[4717]: E0308 06:51:40.783837 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:51:55 crc kubenswrapper[4717]: I0308 06:51:55.781523 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:51:55 crc kubenswrapper[4717]: E0308 06:51:55.782430 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.171178 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549212-9jsnh"] Mar 08 06:52:00 crc kubenswrapper[4717]: E0308 06:52:00.174366 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957225ec-c79c-4068-ad77-76c63b57d468" containerName="oc" Mar 08 06:52:00 crc 
kubenswrapper[4717]: I0308 06:52:00.174587 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="957225ec-c79c-4068-ad77-76c63b57d468" containerName="oc" Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.175442 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="957225ec-c79c-4068-ad77-76c63b57d468" containerName="oc" Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.177220 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549212-9jsnh" Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.179793 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.181581 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.182171 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.200284 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549212-9jsnh"] Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.330371 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf28q\" (UniqueName: \"kubernetes.io/projected/c04f5428-7ebe-4ec3-b9aa-07c6222ff270-kube-api-access-kf28q\") pod \"auto-csr-approver-29549212-9jsnh\" (UID: \"c04f5428-7ebe-4ec3-b9aa-07c6222ff270\") " pod="openshift-infra/auto-csr-approver-29549212-9jsnh" Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.434900 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf28q\" (UniqueName: \"kubernetes.io/projected/c04f5428-7ebe-4ec3-b9aa-07c6222ff270-kube-api-access-kf28q\") pod \"auto-csr-approver-29549212-9jsnh\" 
(UID: \"c04f5428-7ebe-4ec3-b9aa-07c6222ff270\") " pod="openshift-infra/auto-csr-approver-29549212-9jsnh" Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.457790 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf28q\" (UniqueName: \"kubernetes.io/projected/c04f5428-7ebe-4ec3-b9aa-07c6222ff270-kube-api-access-kf28q\") pod \"auto-csr-approver-29549212-9jsnh\" (UID: \"c04f5428-7ebe-4ec3-b9aa-07c6222ff270\") " pod="openshift-infra/auto-csr-approver-29549212-9jsnh" Mar 08 06:52:00 crc kubenswrapper[4717]: I0308 06:52:00.515553 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549212-9jsnh" Mar 08 06:52:01 crc kubenswrapper[4717]: I0308 06:52:01.074246 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549212-9jsnh"] Mar 08 06:52:01 crc kubenswrapper[4717]: I0308 06:52:01.286077 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 06:52:02 crc kubenswrapper[4717]: I0308 06:52:02.240355 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549212-9jsnh" event={"ID":"c04f5428-7ebe-4ec3-b9aa-07c6222ff270","Type":"ContainerStarted","Data":"19c89e5b5d121f60c9851ef2444acda0abd370a7a84cf8bfafb78d27aecb14f7"} Mar 08 06:52:03 crc kubenswrapper[4717]: I0308 06:52:03.260288 4717 generic.go:334] "Generic (PLEG): container finished" podID="c04f5428-7ebe-4ec3-b9aa-07c6222ff270" containerID="975d5dcf0cfbc7dbc35c3ca4c7d2b0d4e8bd14cfaf3dee0ec6acf3cbed0230ba" exitCode=0 Mar 08 06:52:03 crc kubenswrapper[4717]: I0308 06:52:03.260600 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549212-9jsnh" event={"ID":"c04f5428-7ebe-4ec3-b9aa-07c6222ff270","Type":"ContainerDied","Data":"975d5dcf0cfbc7dbc35c3ca4c7d2b0d4e8bd14cfaf3dee0ec6acf3cbed0230ba"} Mar 08 06:52:04 crc kubenswrapper[4717]: I0308 
06:52:04.773361 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549212-9jsnh" Mar 08 06:52:04 crc kubenswrapper[4717]: I0308 06:52:04.848744 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf28q\" (UniqueName: \"kubernetes.io/projected/c04f5428-7ebe-4ec3-b9aa-07c6222ff270-kube-api-access-kf28q\") pod \"c04f5428-7ebe-4ec3-b9aa-07c6222ff270\" (UID: \"c04f5428-7ebe-4ec3-b9aa-07c6222ff270\") " Mar 08 06:52:04 crc kubenswrapper[4717]: I0308 06:52:04.857858 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c04f5428-7ebe-4ec3-b9aa-07c6222ff270-kube-api-access-kf28q" (OuterVolumeSpecName: "kube-api-access-kf28q") pod "c04f5428-7ebe-4ec3-b9aa-07c6222ff270" (UID: "c04f5428-7ebe-4ec3-b9aa-07c6222ff270"). InnerVolumeSpecName "kube-api-access-kf28q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:52:04 crc kubenswrapper[4717]: I0308 06:52:04.951207 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf28q\" (UniqueName: \"kubernetes.io/projected/c04f5428-7ebe-4ec3-b9aa-07c6222ff270-kube-api-access-kf28q\") on node \"crc\" DevicePath \"\"" Mar 08 06:52:05 crc kubenswrapper[4717]: I0308 06:52:05.283808 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549212-9jsnh" event={"ID":"c04f5428-7ebe-4ec3-b9aa-07c6222ff270","Type":"ContainerDied","Data":"19c89e5b5d121f60c9851ef2444acda0abd370a7a84cf8bfafb78d27aecb14f7"} Mar 08 06:52:05 crc kubenswrapper[4717]: I0308 06:52:05.284123 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19c89e5b5d121f60c9851ef2444acda0abd370a7a84cf8bfafb78d27aecb14f7" Mar 08 06:52:05 crc kubenswrapper[4717]: I0308 06:52:05.284123 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549212-9jsnh" Mar 08 06:52:05 crc kubenswrapper[4717]: I0308 06:52:05.849126 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549206-bxbdf"] Mar 08 06:52:05 crc kubenswrapper[4717]: I0308 06:52:05.857973 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549206-bxbdf"] Mar 08 06:52:07 crc kubenswrapper[4717]: I0308 06:52:07.811173 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a" path="/var/lib/kubelet/pods/0ea1dca6-bcce-4675-8b6f-ce2cc6a2022a/volumes" Mar 08 06:52:08 crc kubenswrapper[4717]: I0308 06:52:08.782112 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:52:08 crc kubenswrapper[4717]: E0308 06:52:08.782904 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:52:20 crc kubenswrapper[4717]: I0308 06:52:20.781895 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:52:20 crc kubenswrapper[4717]: E0308 06:52:20.782953 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:52:32 crc kubenswrapper[4717]: I0308 06:52:32.781732 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:52:32 crc kubenswrapper[4717]: E0308 06:52:32.782577 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:52:40 crc kubenswrapper[4717]: I0308 06:52:40.358186 4717 scope.go:117] "RemoveContainer" containerID="ec3fee07b9874b023fb8d1b677d36893e4dfd3e5f150595da7e52911cd2343c9" Mar 08 06:52:43 crc kubenswrapper[4717]: I0308 06:52:43.791722 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:52:43 crc kubenswrapper[4717]: E0308 06:52:43.792848 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:52:55 crc kubenswrapper[4717]: I0308 06:52:55.781659 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:52:55 crc kubenswrapper[4717]: E0308 06:52:55.782780 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:53:10 crc kubenswrapper[4717]: I0308 06:53:10.783255 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:53:10 crc kubenswrapper[4717]: E0308 06:53:10.784233 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:53:23 crc kubenswrapper[4717]: I0308 06:53:23.792177 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:53:23 crc kubenswrapper[4717]: E0308 06:53:23.793028 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.065794 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5cpdt"] Mar 08 06:53:34 crc kubenswrapper[4717]: E0308 06:53:34.067386 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04f5428-7ebe-4ec3-b9aa-07c6222ff270" containerName="oc" Mar 08 06:53:34 crc kubenswrapper[4717]: 
I0308 06:53:34.067405 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04f5428-7ebe-4ec3-b9aa-07c6222ff270" containerName="oc" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.067723 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c04f5428-7ebe-4ec3-b9aa-07c6222ff270" containerName="oc" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.070099 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.082964 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5cpdt"] Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.135786 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-utilities\") pod \"redhat-marketplace-5cpdt\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.135841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkb9\" (UniqueName: \"kubernetes.io/projected/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-kube-api-access-7tkb9\") pod \"redhat-marketplace-5cpdt\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.135873 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-catalog-content\") pod \"redhat-marketplace-5cpdt\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.238081 
4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-utilities\") pod \"redhat-marketplace-5cpdt\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.238124 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkb9\" (UniqueName: \"kubernetes.io/projected/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-kube-api-access-7tkb9\") pod \"redhat-marketplace-5cpdt\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.238156 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-catalog-content\") pod \"redhat-marketplace-5cpdt\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.238656 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-catalog-content\") pod \"redhat-marketplace-5cpdt\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.240351 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-utilities\") pod \"redhat-marketplace-5cpdt\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.263109 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7tkb9\" (UniqueName: \"kubernetes.io/projected/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-kube-api-access-7tkb9\") pod \"redhat-marketplace-5cpdt\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.449851 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:34 crc kubenswrapper[4717]: I0308 06:53:34.781984 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:53:34 crc kubenswrapper[4717]: E0308 06:53:34.782569 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:53:35 crc kubenswrapper[4717]: I0308 06:53:35.030936 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5cpdt"] Mar 08 06:53:35 crc kubenswrapper[4717]: I0308 06:53:35.278078 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5cpdt" event={"ID":"6221c8b8-3840-47d6-afaa-ac5d3a315e3a","Type":"ContainerStarted","Data":"136f89eb833adcfaf622f5186d20ae48f46a3b68382f1327de7bff5179e1139e"} Mar 08 06:53:36 crc kubenswrapper[4717]: I0308 06:53:36.289136 4717 generic.go:334] "Generic (PLEG): container finished" podID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerID="a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085" exitCode=0 Mar 08 06:53:36 crc kubenswrapper[4717]: I0308 06:53:36.289211 4717 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5cpdt" event={"ID":"6221c8b8-3840-47d6-afaa-ac5d3a315e3a","Type":"ContainerDied","Data":"a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085"} Mar 08 06:53:37 crc kubenswrapper[4717]: I0308 06:53:37.304054 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5cpdt" event={"ID":"6221c8b8-3840-47d6-afaa-ac5d3a315e3a","Type":"ContainerStarted","Data":"4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc"} Mar 08 06:53:38 crc kubenswrapper[4717]: I0308 06:53:38.317154 4717 generic.go:334] "Generic (PLEG): container finished" podID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerID="4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc" exitCode=0 Mar 08 06:53:38 crc kubenswrapper[4717]: I0308 06:53:38.317221 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5cpdt" event={"ID":"6221c8b8-3840-47d6-afaa-ac5d3a315e3a","Type":"ContainerDied","Data":"4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc"} Mar 08 06:53:39 crc kubenswrapper[4717]: I0308 06:53:39.345851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5cpdt" event={"ID":"6221c8b8-3840-47d6-afaa-ac5d3a315e3a","Type":"ContainerStarted","Data":"a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194"} Mar 08 06:53:39 crc kubenswrapper[4717]: I0308 06:53:39.371607 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5cpdt" podStartSLOduration=2.917452147 podStartE2EDuration="5.371585791s" podCreationTimestamp="2026-03-08 06:53:34 +0000 UTC" firstStartedPulling="2026-03-08 06:53:36.291904386 +0000 UTC m=+5243.209553230" lastFinishedPulling="2026-03-08 06:53:38.74603803 +0000 UTC m=+5245.663686874" observedRunningTime="2026-03-08 06:53:39.367419429 +0000 UTC m=+5246.285068273" 
watchObservedRunningTime="2026-03-08 06:53:39.371585791 +0000 UTC m=+5246.289234655" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.377558 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mwgrx"] Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.380334 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.400443 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mwgrx"] Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.484770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggznt\" (UniqueName: \"kubernetes.io/projected/a22a2522-9010-4199-b02c-c583d518faa3-kube-api-access-ggznt\") pod \"redhat-operators-mwgrx\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.484867 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-catalog-content\") pod \"redhat-operators-mwgrx\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.484917 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-utilities\") pod \"redhat-operators-mwgrx\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.587642 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ggznt\" (UniqueName: \"kubernetes.io/projected/a22a2522-9010-4199-b02c-c583d518faa3-kube-api-access-ggznt\") pod \"redhat-operators-mwgrx\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.587851 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-catalog-content\") pod \"redhat-operators-mwgrx\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.587898 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-utilities\") pod \"redhat-operators-mwgrx\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.588497 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-utilities\") pod \"redhat-operators-mwgrx\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.588504 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-catalog-content\") pod \"redhat-operators-mwgrx\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.621826 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggznt\" (UniqueName: 
\"kubernetes.io/projected/a22a2522-9010-4199-b02c-c583d518faa3-kube-api-access-ggznt\") pod \"redhat-operators-mwgrx\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:41 crc kubenswrapper[4717]: I0308 06:53:41.704894 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:53:42 crc kubenswrapper[4717]: I0308 06:53:42.193408 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mwgrx"] Mar 08 06:53:42 crc kubenswrapper[4717]: W0308 06:53:42.194300 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda22a2522_9010_4199_b02c_c583d518faa3.slice/crio-c27737ef36af4b7a2466a2c72e779baa3b48f7f82cd8e0d2fb4568c7f99cdba7 WatchSource:0}: Error finding container c27737ef36af4b7a2466a2c72e779baa3b48f7f82cd8e0d2fb4568c7f99cdba7: Status 404 returned error can't find the container with id c27737ef36af4b7a2466a2c72e779baa3b48f7f82cd8e0d2fb4568c7f99cdba7 Mar 08 06:53:42 crc kubenswrapper[4717]: I0308 06:53:42.377151 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwgrx" event={"ID":"a22a2522-9010-4199-b02c-c583d518faa3","Type":"ContainerStarted","Data":"c27737ef36af4b7a2466a2c72e779baa3b48f7f82cd8e0d2fb4568c7f99cdba7"} Mar 08 06:53:43 crc kubenswrapper[4717]: I0308 06:53:43.391293 4717 generic.go:334] "Generic (PLEG): container finished" podID="a22a2522-9010-4199-b02c-c583d518faa3" containerID="773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95" exitCode=0 Mar 08 06:53:43 crc kubenswrapper[4717]: I0308 06:53:43.391342 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwgrx" 
event={"ID":"a22a2522-9010-4199-b02c-c583d518faa3","Type":"ContainerDied","Data":"773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95"} Mar 08 06:53:44 crc kubenswrapper[4717]: I0308 06:53:44.450807 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:44 crc kubenswrapper[4717]: I0308 06:53:44.451189 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:44 crc kubenswrapper[4717]: I0308 06:53:44.510146 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:45 crc kubenswrapper[4717]: I0308 06:53:45.410614 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwgrx" event={"ID":"a22a2522-9010-4199-b02c-c583d518faa3","Type":"ContainerStarted","Data":"97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7"} Mar 08 06:53:45 crc kubenswrapper[4717]: I0308 06:53:45.462341 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:47 crc kubenswrapper[4717]: I0308 06:53:47.568371 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5cpdt"] Mar 08 06:53:47 crc kubenswrapper[4717]: I0308 06:53:47.568878 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5cpdt" podUID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerName="registry-server" containerID="cri-o://a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194" gracePeriod=2 Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.067793 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.242736 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-utilities\") pod \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.243039 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-catalog-content\") pod \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.243222 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tkb9\" (UniqueName: \"kubernetes.io/projected/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-kube-api-access-7tkb9\") pod \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\" (UID: \"6221c8b8-3840-47d6-afaa-ac5d3a315e3a\") " Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.245809 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-utilities" (OuterVolumeSpecName: "utilities") pod "6221c8b8-3840-47d6-afaa-ac5d3a315e3a" (UID: "6221c8b8-3840-47d6-afaa-ac5d3a315e3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.254202 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-kube-api-access-7tkb9" (OuterVolumeSpecName: "kube-api-access-7tkb9") pod "6221c8b8-3840-47d6-afaa-ac5d3a315e3a" (UID: "6221c8b8-3840-47d6-afaa-ac5d3a315e3a"). InnerVolumeSpecName "kube-api-access-7tkb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.267450 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6221c8b8-3840-47d6-afaa-ac5d3a315e3a" (UID: "6221c8b8-3840-47d6-afaa-ac5d3a315e3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.346301 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tkb9\" (UniqueName: \"kubernetes.io/projected/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-kube-api-access-7tkb9\") on node \"crc\" DevicePath \"\"" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.346335 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.346345 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6221c8b8-3840-47d6-afaa-ac5d3a315e3a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.445201 4717 generic.go:334] "Generic (PLEG): container finished" podID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerID="a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194" exitCode=0 Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.445276 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5cpdt" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.445292 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5cpdt" event={"ID":"6221c8b8-3840-47d6-afaa-ac5d3a315e3a","Type":"ContainerDied","Data":"a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194"} Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.445661 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5cpdt" event={"ID":"6221c8b8-3840-47d6-afaa-ac5d3a315e3a","Type":"ContainerDied","Data":"136f89eb833adcfaf622f5186d20ae48f46a3b68382f1327de7bff5179e1139e"} Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.445732 4717 scope.go:117] "RemoveContainer" containerID="a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.465429 4717 scope.go:117] "RemoveContainer" containerID="4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.483582 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5cpdt"] Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.490751 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5cpdt"] Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.494586 4717 scope.go:117] "RemoveContainer" containerID="a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.546751 4717 scope.go:117] "RemoveContainer" containerID="a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194" Mar 08 06:53:48 crc kubenswrapper[4717]: E0308 06:53:48.547378 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194\": container with ID starting with a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194 not found: ID does not exist" containerID="a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.547420 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194"} err="failed to get container status \"a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194\": rpc error: code = NotFound desc = could not find container \"a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194\": container with ID starting with a8ce6ec7204febec0e704b0936c78065ef145aaf17c864772a53c09a2f318194 not found: ID does not exist" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.547442 4717 scope.go:117] "RemoveContainer" containerID="4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc" Mar 08 06:53:48 crc kubenswrapper[4717]: E0308 06:53:48.547937 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc\": container with ID starting with 4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc not found: ID does not exist" containerID="4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.547986 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc"} err="failed to get container status \"4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc\": rpc error: code = NotFound desc = could not find container \"4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc\": container with ID 
starting with 4e7280a424646d26b71dea23f98ca8b0584cdfd14b6ab6ed453dc843d0b809cc not found: ID does not exist" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.548017 4717 scope.go:117] "RemoveContainer" containerID="a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085" Mar 08 06:53:48 crc kubenswrapper[4717]: E0308 06:53:48.548364 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085\": container with ID starting with a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085 not found: ID does not exist" containerID="a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085" Mar 08 06:53:48 crc kubenswrapper[4717]: I0308 06:53:48.548391 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085"} err="failed to get container status \"a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085\": rpc error: code = NotFound desc = could not find container \"a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085\": container with ID starting with a71ee9cfe614da4a751be3f8d393ecc982f6973e61da12b5790a812f9d248085 not found: ID does not exist" Mar 08 06:53:49 crc kubenswrapper[4717]: I0308 06:53:49.782366 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:53:49 crc kubenswrapper[4717]: E0308 06:53:49.783007 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:53:49 crc kubenswrapper[4717]: I0308 06:53:49.794274 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" path="/var/lib/kubelet/pods/6221c8b8-3840-47d6-afaa-ac5d3a315e3a/volumes" Mar 08 06:53:51 crc kubenswrapper[4717]: I0308 06:53:51.487446 4717 generic.go:334] "Generic (PLEG): container finished" podID="a22a2522-9010-4199-b02c-c583d518faa3" containerID="97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7" exitCode=0 Mar 08 06:53:51 crc kubenswrapper[4717]: I0308 06:53:51.487536 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwgrx" event={"ID":"a22a2522-9010-4199-b02c-c583d518faa3","Type":"ContainerDied","Data":"97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7"} Mar 08 06:53:52 crc kubenswrapper[4717]: I0308 06:53:52.500847 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwgrx" event={"ID":"a22a2522-9010-4199-b02c-c583d518faa3","Type":"ContainerStarted","Data":"f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e"} Mar 08 06:53:52 crc kubenswrapper[4717]: I0308 06:53:52.535473 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mwgrx" podStartSLOduration=2.976245774 podStartE2EDuration="11.535447785s" podCreationTimestamp="2026-03-08 06:53:41 +0000 UTC" firstStartedPulling="2026-03-08 06:53:43.395408941 +0000 UTC m=+5250.313057795" lastFinishedPulling="2026-03-08 06:53:51.954610932 +0000 UTC m=+5258.872259806" observedRunningTime="2026-03-08 06:53:52.524309441 +0000 UTC m=+5259.441958285" watchObservedRunningTime="2026-03-08 06:53:52.535447785 +0000 UTC m=+5259.453096659" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.142702 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549214-sz8rk"] Mar 
08 06:54:00 crc kubenswrapper[4717]: E0308 06:54:00.143610 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerName="registry-server" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.143622 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerName="registry-server" Mar 08 06:54:00 crc kubenswrapper[4717]: E0308 06:54:00.143642 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerName="extract-utilities" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.143648 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerName="extract-utilities" Mar 08 06:54:00 crc kubenswrapper[4717]: E0308 06:54:00.143673 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerName="extract-content" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.143680 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerName="extract-content" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.143889 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6221c8b8-3840-47d6-afaa-ac5d3a315e3a" containerName="registry-server" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.144517 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549214-sz8rk" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.147156 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.147903 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.148381 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.151610 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549214-sz8rk"] Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.299563 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psbmr\" (UniqueName: \"kubernetes.io/projected/3846f96d-edbf-4b3e-b9db-8abd444b27b9-kube-api-access-psbmr\") pod \"auto-csr-approver-29549214-sz8rk\" (UID: \"3846f96d-edbf-4b3e-b9db-8abd444b27b9\") " pod="openshift-infra/auto-csr-approver-29549214-sz8rk" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.402524 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psbmr\" (UniqueName: \"kubernetes.io/projected/3846f96d-edbf-4b3e-b9db-8abd444b27b9-kube-api-access-psbmr\") pod \"auto-csr-approver-29549214-sz8rk\" (UID: \"3846f96d-edbf-4b3e-b9db-8abd444b27b9\") " pod="openshift-infra/auto-csr-approver-29549214-sz8rk" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.431847 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psbmr\" (UniqueName: \"kubernetes.io/projected/3846f96d-edbf-4b3e-b9db-8abd444b27b9-kube-api-access-psbmr\") pod \"auto-csr-approver-29549214-sz8rk\" (UID: \"3846f96d-edbf-4b3e-b9db-8abd444b27b9\") " 
pod="openshift-infra/auto-csr-approver-29549214-sz8rk" Mar 08 06:54:00 crc kubenswrapper[4717]: I0308 06:54:00.460329 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549214-sz8rk" Mar 08 06:54:01 crc kubenswrapper[4717]: I0308 06:54:01.064338 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549214-sz8rk"] Mar 08 06:54:01 crc kubenswrapper[4717]: I0308 06:54:01.588866 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549214-sz8rk" event={"ID":"3846f96d-edbf-4b3e-b9db-8abd444b27b9","Type":"ContainerStarted","Data":"ab12881b4f5552f2e32b87c8b73f060c6fe8c0809b9c76df2dd0ee192489c596"} Mar 08 06:54:01 crc kubenswrapper[4717]: I0308 06:54:01.705279 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:54:01 crc kubenswrapper[4717]: I0308 06:54:01.708115 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:54:01 crc kubenswrapper[4717]: I0308 06:54:01.803598 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:54:02 crc kubenswrapper[4717]: I0308 06:54:02.599167 4717 generic.go:334] "Generic (PLEG): container finished" podID="3846f96d-edbf-4b3e-b9db-8abd444b27b9" containerID="b5ad4c57144b959b90f11f4c6b639ac217c5d8491716dd26da0ebc22be04b5df" exitCode=0 Mar 08 06:54:02 crc kubenswrapper[4717]: I0308 06:54:02.599346 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549214-sz8rk" event={"ID":"3846f96d-edbf-4b3e-b9db-8abd444b27b9","Type":"ContainerDied","Data":"b5ad4c57144b959b90f11f4c6b639ac217c5d8491716dd26da0ebc22be04b5df"} Mar 08 06:54:02 crc kubenswrapper[4717]: I0308 06:54:02.683393 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:54:02 crc kubenswrapper[4717]: I0308 06:54:02.729570 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mwgrx"] Mar 08 06:54:04 crc kubenswrapper[4717]: I0308 06:54:04.094383 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549214-sz8rk" Mar 08 06:54:04 crc kubenswrapper[4717]: I0308 06:54:04.190976 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psbmr\" (UniqueName: \"kubernetes.io/projected/3846f96d-edbf-4b3e-b9db-8abd444b27b9-kube-api-access-psbmr\") pod \"3846f96d-edbf-4b3e-b9db-8abd444b27b9\" (UID: \"3846f96d-edbf-4b3e-b9db-8abd444b27b9\") " Mar 08 06:54:04 crc kubenswrapper[4717]: I0308 06:54:04.196248 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3846f96d-edbf-4b3e-b9db-8abd444b27b9-kube-api-access-psbmr" (OuterVolumeSpecName: "kube-api-access-psbmr") pod "3846f96d-edbf-4b3e-b9db-8abd444b27b9" (UID: "3846f96d-edbf-4b3e-b9db-8abd444b27b9"). InnerVolumeSpecName "kube-api-access-psbmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:54:04 crc kubenswrapper[4717]: I0308 06:54:04.293629 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psbmr\" (UniqueName: \"kubernetes.io/projected/3846f96d-edbf-4b3e-b9db-8abd444b27b9-kube-api-access-psbmr\") on node \"crc\" DevicePath \"\"" Mar 08 06:54:04 crc kubenswrapper[4717]: I0308 06:54:04.623921 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mwgrx" podUID="a22a2522-9010-4199-b02c-c583d518faa3" containerName="registry-server" containerID="cri-o://f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e" gracePeriod=2 Mar 08 06:54:04 crc kubenswrapper[4717]: I0308 06:54:04.624737 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549214-sz8rk" Mar 08 06:54:04 crc kubenswrapper[4717]: I0308 06:54:04.627888 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549214-sz8rk" event={"ID":"3846f96d-edbf-4b3e-b9db-8abd444b27b9","Type":"ContainerDied","Data":"ab12881b4f5552f2e32b87c8b73f060c6fe8c0809b9c76df2dd0ee192489c596"} Mar 08 06:54:04 crc kubenswrapper[4717]: I0308 06:54:04.627936 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab12881b4f5552f2e32b87c8b73f060c6fe8c0809b9c76df2dd0ee192489c596" Mar 08 06:54:04 crc kubenswrapper[4717]: I0308 06:54:04.782885 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:54:04 crc kubenswrapper[4717]: E0308 06:54:04.783373 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.147319 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.165100 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549208-7zc26"] Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.189415 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549208-7zc26"] Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.312475 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggznt\" (UniqueName: \"kubernetes.io/projected/a22a2522-9010-4199-b02c-c583d518faa3-kube-api-access-ggznt\") pod \"a22a2522-9010-4199-b02c-c583d518faa3\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.313292 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-utilities\") pod \"a22a2522-9010-4199-b02c-c583d518faa3\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.313334 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-catalog-content\") pod \"a22a2522-9010-4199-b02c-c583d518faa3\" (UID: \"a22a2522-9010-4199-b02c-c583d518faa3\") " Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.314326 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-utilities" (OuterVolumeSpecName: "utilities") pod "a22a2522-9010-4199-b02c-c583d518faa3" (UID: "a22a2522-9010-4199-b02c-c583d518faa3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.319490 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22a2522-9010-4199-b02c-c583d518faa3-kube-api-access-ggznt" (OuterVolumeSpecName: "kube-api-access-ggznt") pod "a22a2522-9010-4199-b02c-c583d518faa3" (UID: "a22a2522-9010-4199-b02c-c583d518faa3"). InnerVolumeSpecName "kube-api-access-ggznt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.415594 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggznt\" (UniqueName: \"kubernetes.io/projected/a22a2522-9010-4199-b02c-c583d518faa3-kube-api-access-ggznt\") on node \"crc\" DevicePath \"\"" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.415626 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.438525 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a22a2522-9010-4199-b02c-c583d518faa3" (UID: "a22a2522-9010-4199-b02c-c583d518faa3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.517420 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a22a2522-9010-4199-b02c-c583d518faa3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.634916 4717 generic.go:334] "Generic (PLEG): container finished" podID="a22a2522-9010-4199-b02c-c583d518faa3" containerID="f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e" exitCode=0 Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.634956 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwgrx" event={"ID":"a22a2522-9010-4199-b02c-c583d518faa3","Type":"ContainerDied","Data":"f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e"} Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.634986 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwgrx" event={"ID":"a22a2522-9010-4199-b02c-c583d518faa3","Type":"ContainerDied","Data":"c27737ef36af4b7a2466a2c72e779baa3b48f7f82cd8e0d2fb4568c7f99cdba7"} Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.635004 4717 scope.go:117] "RemoveContainer" containerID="f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.635024 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mwgrx" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.677098 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mwgrx"] Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.686856 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mwgrx"] Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.693055 4717 scope.go:117] "RemoveContainer" containerID="97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.731532 4717 scope.go:117] "RemoveContainer" containerID="773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.766583 4717 scope.go:117] "RemoveContainer" containerID="f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e" Mar 08 06:54:05 crc kubenswrapper[4717]: E0308 06:54:05.766993 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e\": container with ID starting with f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e not found: ID does not exist" containerID="f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.767024 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e"} err="failed to get container status \"f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e\": rpc error: code = NotFound desc = could not find container \"f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e\": container with ID starting with f14ff50ecb42c56971e83219c003b6dd651769611bef065cf05b5d03947ed23e not found: ID does 
not exist" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.767046 4717 scope.go:117] "RemoveContainer" containerID="97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7" Mar 08 06:54:05 crc kubenswrapper[4717]: E0308 06:54:05.767281 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7\": container with ID starting with 97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7 not found: ID does not exist" containerID="97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.767312 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7"} err="failed to get container status \"97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7\": rpc error: code = NotFound desc = could not find container \"97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7\": container with ID starting with 97de7a5207bb975efa347e61b1e30ae28426a3efedea5120e6a423a8f5858fb7 not found: ID does not exist" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.767332 4717 scope.go:117] "RemoveContainer" containerID="773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95" Mar 08 06:54:05 crc kubenswrapper[4717]: E0308 06:54:05.767623 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95\": container with ID starting with 773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95 not found: ID does not exist" containerID="773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.767664 4717 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95"} err="failed to get container status \"773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95\": rpc error: code = NotFound desc = could not find container \"773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95\": container with ID starting with 773d000fbf0cd751eb8742b881e3e3836f3ab971c4bc68b21dbe1069eba88b95 not found: ID does not exist" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.796235 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a22a2522-9010-4199-b02c-c583d518faa3" path="/var/lib/kubelet/pods/a22a2522-9010-4199-b02c-c583d518faa3/volumes" Mar 08 06:54:05 crc kubenswrapper[4717]: I0308 06:54:05.796901 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17c651a-fec7-4506-af80-32caddc1fb76" path="/var/lib/kubelet/pods/e17c651a-fec7-4506-af80-32caddc1fb76/volumes" Mar 08 06:54:16 crc kubenswrapper[4717]: I0308 06:54:16.782012 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:54:16 crc kubenswrapper[4717]: E0308 06:54:16.783407 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:54:27 crc kubenswrapper[4717]: I0308 06:54:27.782678 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:54:27 crc kubenswrapper[4717]: E0308 06:54:27.783874 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:54:40 crc kubenswrapper[4717]: I0308 06:54:40.458164 4717 scope.go:117] "RemoveContainer" containerID="46c0b0f0d18c747ad4d4aca672d8b289cb8ae0c150c1b54aeba9fd3f44f43f12" Mar 08 06:54:40 crc kubenswrapper[4717]: I0308 06:54:40.782220 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:54:40 crc kubenswrapper[4717]: E0308 06:54:40.782798 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:54:52 crc kubenswrapper[4717]: I0308 06:54:52.782005 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:54:52 crc kubenswrapper[4717]: E0308 06:54:52.783179 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:55:03 crc kubenswrapper[4717]: I0308 06:55:03.804298 4717 scope.go:117] "RemoveContainer" 
containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:55:03 crc kubenswrapper[4717]: E0308 06:55:03.805669 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:55:17 crc kubenswrapper[4717]: I0308 06:55:17.782513 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:55:17 crc kubenswrapper[4717]: E0308 06:55:17.783331 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:55:25 crc kubenswrapper[4717]: I0308 06:55:25.591411 4717 generic.go:334] "Generic (PLEG): container finished" podID="0e0647cd-807a-44fc-a1e0-f5ce609b835d" containerID="ecc674a5ac53902011c233ddb8c66be59ee08e6485b51ba17f5f51c4176e67a8" exitCode=0 Mar 08 06:55:25 crc kubenswrapper[4717]: I0308 06:55:25.591527 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0e0647cd-807a-44fc-a1e0-f5ce609b835d","Type":"ContainerDied","Data":"ecc674a5ac53902011c233ddb8c66be59ee08e6485b51ba17f5f51c4176e67a8"} Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.056017 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.150676 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config\") pod \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.150972 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.151092 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ssh-key\") pod \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.151162 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-config-data\") pod \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.151226 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-workdir\") pod \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.151365 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config-secret\") pod \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.151454 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-temporary\") pod \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.151541 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2qwn\" (UniqueName: \"kubernetes.io/projected/0e0647cd-807a-44fc-a1e0-f5ce609b835d-kube-api-access-b2qwn\") pod \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.151625 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ca-certs\") pod \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\" (UID: \"0e0647cd-807a-44fc-a1e0-f5ce609b835d\") " Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.152199 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "0e0647cd-807a-44fc-a1e0-f5ce609b835d" (UID: "0e0647cd-807a-44fc-a1e0-f5ce609b835d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.152274 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-config-data" (OuterVolumeSpecName: "config-data") pod "0e0647cd-807a-44fc-a1e0-f5ce609b835d" (UID: "0e0647cd-807a-44fc-a1e0-f5ce609b835d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.152785 4717 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.152805 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.155183 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "0e0647cd-807a-44fc-a1e0-f5ce609b835d" (UID: "0e0647cd-807a-44fc-a1e0-f5ce609b835d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.156728 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0647cd-807a-44fc-a1e0-f5ce609b835d-kube-api-access-b2qwn" (OuterVolumeSpecName: "kube-api-access-b2qwn") pod "0e0647cd-807a-44fc-a1e0-f5ce609b835d" (UID: "0e0647cd-807a-44fc-a1e0-f5ce609b835d"). InnerVolumeSpecName "kube-api-access-b2qwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.156878 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "0e0647cd-807a-44fc-a1e0-f5ce609b835d" (UID: "0e0647cd-807a-44fc-a1e0-f5ce609b835d"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.188005 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0e0647cd-807a-44fc-a1e0-f5ce609b835d" (UID: "0e0647cd-807a-44fc-a1e0-f5ce609b835d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.189697 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0e0647cd-807a-44fc-a1e0-f5ce609b835d" (UID: "0e0647cd-807a-44fc-a1e0-f5ce609b835d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.195367 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "0e0647cd-807a-44fc-a1e0-f5ce609b835d" (UID: "0e0647cd-807a-44fc-a1e0-f5ce609b835d"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.200502 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0e0647cd-807a-44fc-a1e0-f5ce609b835d" (UID: "0e0647cd-807a-44fc-a1e0-f5ce609b835d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.254732 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.254825 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.254847 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.254867 4717 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0e0647cd-807a-44fc-a1e0-f5ce609b835d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.254888 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.254908 4717 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-b2qwn\" (UniqueName: \"kubernetes.io/projected/0e0647cd-807a-44fc-a1e0-f5ce609b835d-kube-api-access-b2qwn\") on node \"crc\" DevicePath \"\"" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.254925 4717 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0e0647cd-807a-44fc-a1e0-f5ce609b835d-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.285754 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.356634 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.626035 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0e0647cd-807a-44fc-a1e0-f5ce609b835d","Type":"ContainerDied","Data":"d520811e1c7616ff6434ff78047258ddac782960a76a26a86428bc6c652af646"} Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.626079 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d520811e1c7616ff6434ff78047258ddac782960a76a26a86428bc6c652af646" Mar 08 06:55:27 crc kubenswrapper[4717]: I0308 06:55:27.626134 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 08 06:55:28 crc kubenswrapper[4717]: I0308 06:55:28.782788 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:55:28 crc kubenswrapper[4717]: E0308 06:55:28.783820 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.532881 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 08 06:55:37 crc kubenswrapper[4717]: E0308 06:55:37.537001 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22a2522-9010-4199-b02c-c583d518faa3" containerName="extract-content" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.537035 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22a2522-9010-4199-b02c-c583d518faa3" containerName="extract-content" Mar 08 06:55:37 crc kubenswrapper[4717]: E0308 06:55:37.537067 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3846f96d-edbf-4b3e-b9db-8abd444b27b9" containerName="oc" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.537079 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3846f96d-edbf-4b3e-b9db-8abd444b27b9" containerName="oc" Mar 08 06:55:37 crc kubenswrapper[4717]: E0308 06:55:37.537214 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22a2522-9010-4199-b02c-c583d518faa3" containerName="extract-utilities" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.537230 4717 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a22a2522-9010-4199-b02c-c583d518faa3" containerName="extract-utilities" Mar 08 06:55:37 crc kubenswrapper[4717]: E0308 06:55:37.537303 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0647cd-807a-44fc-a1e0-f5ce609b835d" containerName="tempest-tests-tempest-tests-runner" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.537317 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0647cd-807a-44fc-a1e0-f5ce609b835d" containerName="tempest-tests-tempest-tests-runner" Mar 08 06:55:37 crc kubenswrapper[4717]: E0308 06:55:37.537333 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22a2522-9010-4199-b02c-c583d518faa3" containerName="registry-server" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.537344 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22a2522-9010-4199-b02c-c583d518faa3" containerName="registry-server" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.537650 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0647cd-807a-44fc-a1e0-f5ce609b835d" containerName="tempest-tests-tempest-tests-runner" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.537716 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22a2522-9010-4199-b02c-c583d518faa3" containerName="registry-server" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.537732 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3846f96d-edbf-4b3e-b9db-8abd444b27b9" containerName="oc" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.538768 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.543494 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w5njs" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.543942 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.584847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwkmw\" (UniqueName: \"kubernetes.io/projected/d536dc5d-11cf-4b1a-81bc-a17b532c9baa-kube-api-access-rwkmw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d536dc5d-11cf-4b1a-81bc-a17b532c9baa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.584934 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d536dc5d-11cf-4b1a-81bc-a17b532c9baa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.687340 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwkmw\" (UniqueName: \"kubernetes.io/projected/d536dc5d-11cf-4b1a-81bc-a17b532c9baa-kube-api-access-rwkmw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d536dc5d-11cf-4b1a-81bc-a17b532c9baa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.687531 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d536dc5d-11cf-4b1a-81bc-a17b532c9baa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.688177 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d536dc5d-11cf-4b1a-81bc-a17b532c9baa\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.709328 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwkmw\" (UniqueName: \"kubernetes.io/projected/d536dc5d-11cf-4b1a-81bc-a17b532c9baa-kube-api-access-rwkmw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d536dc5d-11cf-4b1a-81bc-a17b532c9baa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.741851 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d536dc5d-11cf-4b1a-81bc-a17b532c9baa\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 08 06:55:37 crc kubenswrapper[4717]: I0308 06:55:37.868622 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 08 06:55:38 crc kubenswrapper[4717]: I0308 06:55:38.439675 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 08 06:55:38 crc kubenswrapper[4717]: I0308 06:55:38.743095 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d536dc5d-11cf-4b1a-81bc-a17b532c9baa","Type":"ContainerStarted","Data":"416ce72179a6bea37a74f2f9b39a2bdd438fa9f0eca5b7a08bf8ae93f59cd2e6"} Mar 08 06:55:40 crc kubenswrapper[4717]: I0308 06:55:40.762230 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d536dc5d-11cf-4b1a-81bc-a17b532c9baa","Type":"ContainerStarted","Data":"e7d260c122db95c08305b7f88a1734aea7b1da003b59278dfda405d740e5ecd9"} Mar 08 06:55:40 crc kubenswrapper[4717]: I0308 06:55:40.786779 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.819634896 podStartE2EDuration="3.786760001s" podCreationTimestamp="2026-03-08 06:55:37 +0000 UTC" firstStartedPulling="2026-03-08 06:55:38.455709449 +0000 UTC m=+5365.373358293" lastFinishedPulling="2026-03-08 06:55:40.422834514 +0000 UTC m=+5367.340483398" observedRunningTime="2026-03-08 06:55:40.780365664 +0000 UTC m=+5367.698014528" watchObservedRunningTime="2026-03-08 06:55:40.786760001 +0000 UTC m=+5367.704408855" Mar 08 06:55:41 crc kubenswrapper[4717]: I0308 06:55:41.781559 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:55:42 crc kubenswrapper[4717]: I0308 06:55:42.787628 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"7bb4ca3169b930f0d41e9232d9c0d19998dde248047150051071c9eed9222ab2"} Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.142104 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549216-xz7dn"] Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.143973 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549216-xz7dn" Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.146324 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.146607 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.146944 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.153256 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549216-xz7dn"] Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.310328 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxnvv\" (UniqueName: \"kubernetes.io/projected/07e1d41e-d3f2-4832-920d-38d43f4ef25f-kube-api-access-fxnvv\") pod \"auto-csr-approver-29549216-xz7dn\" (UID: \"07e1d41e-d3f2-4832-920d-38d43f4ef25f\") " pod="openshift-infra/auto-csr-approver-29549216-xz7dn" Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.411998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxnvv\" (UniqueName: \"kubernetes.io/projected/07e1d41e-d3f2-4832-920d-38d43f4ef25f-kube-api-access-fxnvv\") pod \"auto-csr-approver-29549216-xz7dn\" (UID: 
\"07e1d41e-d3f2-4832-920d-38d43f4ef25f\") " pod="openshift-infra/auto-csr-approver-29549216-xz7dn" Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.440474 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxnvv\" (UniqueName: \"kubernetes.io/projected/07e1d41e-d3f2-4832-920d-38d43f4ef25f-kube-api-access-fxnvv\") pod \"auto-csr-approver-29549216-xz7dn\" (UID: \"07e1d41e-d3f2-4832-920d-38d43f4ef25f\") " pod="openshift-infra/auto-csr-approver-29549216-xz7dn" Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.464472 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549216-xz7dn" Mar 08 06:56:00 crc kubenswrapper[4717]: I0308 06:56:00.940871 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549216-xz7dn"] Mar 08 06:56:00 crc kubenswrapper[4717]: W0308 06:56:00.973950 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07e1d41e_d3f2_4832_920d_38d43f4ef25f.slice/crio-76345d99a144327dc3f5924977bcbfd69bd5c4ab5aa9567bcfff11ca00f518a5 WatchSource:0}: Error finding container 76345d99a144327dc3f5924977bcbfd69bd5c4ab5aa9567bcfff11ca00f518a5: Status 404 returned error can't find the container with id 76345d99a144327dc3f5924977bcbfd69bd5c4ab5aa9567bcfff11ca00f518a5 Mar 08 06:56:01 crc kubenswrapper[4717]: I0308 06:56:01.039560 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549216-xz7dn" event={"ID":"07e1d41e-d3f2-4832-920d-38d43f4ef25f","Type":"ContainerStarted","Data":"76345d99a144327dc3f5924977bcbfd69bd5c4ab5aa9567bcfff11ca00f518a5"} Mar 08 06:56:02 crc kubenswrapper[4717]: I0308 06:56:02.829417 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4xsgn/must-gather-gwp6z"] Mar 08 06:56:02 crc kubenswrapper[4717]: I0308 06:56:02.831392 4717 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsgn/must-gather-gwp6z" Mar 08 06:56:02 crc kubenswrapper[4717]: I0308 06:56:02.834575 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4xsgn"/"kube-root-ca.crt" Mar 08 06:56:02 crc kubenswrapper[4717]: I0308 06:56:02.848072 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4xsgn"/"default-dockercfg-h9rwr" Mar 08 06:56:02 crc kubenswrapper[4717]: I0308 06:56:02.848548 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4xsgn"/"openshift-service-ca.crt" Mar 08 06:56:02 crc kubenswrapper[4717]: I0308 06:56:02.923329 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4xsgn/must-gather-gwp6z"] Mar 08 06:56:02 crc kubenswrapper[4717]: I0308 06:56:02.968842 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzjg4\" (UniqueName: \"kubernetes.io/projected/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-kube-api-access-bzjg4\") pod \"must-gather-gwp6z\" (UID: \"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4\") " pod="openshift-must-gather-4xsgn/must-gather-gwp6z" Mar 08 06:56:02 crc kubenswrapper[4717]: I0308 06:56:02.968938 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-must-gather-output\") pod \"must-gather-gwp6z\" (UID: \"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4\") " pod="openshift-must-gather-4xsgn/must-gather-gwp6z" Mar 08 06:56:03 crc kubenswrapper[4717]: I0308 06:56:03.066960 4717 generic.go:334] "Generic (PLEG): container finished" podID="07e1d41e-d3f2-4832-920d-38d43f4ef25f" containerID="7a6b96c58da60e255cd2be0a2630c58ca02cf2e3d940d1d432b9c7246294432c" exitCode=0 Mar 08 06:56:03 crc kubenswrapper[4717]: I0308 06:56:03.067000 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549216-xz7dn" event={"ID":"07e1d41e-d3f2-4832-920d-38d43f4ef25f","Type":"ContainerDied","Data":"7a6b96c58da60e255cd2be0a2630c58ca02cf2e3d940d1d432b9c7246294432c"} Mar 08 06:56:03 crc kubenswrapper[4717]: I0308 06:56:03.070194 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzjg4\" (UniqueName: \"kubernetes.io/projected/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-kube-api-access-bzjg4\") pod \"must-gather-gwp6z\" (UID: \"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4\") " pod="openshift-must-gather-4xsgn/must-gather-gwp6z" Mar 08 06:56:03 crc kubenswrapper[4717]: I0308 06:56:03.070372 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-must-gather-output\") pod \"must-gather-gwp6z\" (UID: \"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4\") " pod="openshift-must-gather-4xsgn/must-gather-gwp6z" Mar 08 06:56:03 crc kubenswrapper[4717]: I0308 06:56:03.070803 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-must-gather-output\") pod \"must-gather-gwp6z\" (UID: \"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4\") " pod="openshift-must-gather-4xsgn/must-gather-gwp6z" Mar 08 06:56:03 crc kubenswrapper[4717]: I0308 06:56:03.104542 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzjg4\" (UniqueName: \"kubernetes.io/projected/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-kube-api-access-bzjg4\") pod \"must-gather-gwp6z\" (UID: \"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4\") " pod="openshift-must-gather-4xsgn/must-gather-gwp6z" Mar 08 06:56:03 crc kubenswrapper[4717]: I0308 06:56:03.145008 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsgn/must-gather-gwp6z" Mar 08 06:56:03 crc kubenswrapper[4717]: I0308 06:56:03.607376 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4xsgn/must-gather-gwp6z"] Mar 08 06:56:04 crc kubenswrapper[4717]: I0308 06:56:04.474280 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549216-xz7dn" Mar 08 06:56:04 crc kubenswrapper[4717]: I0308 06:56:04.600409 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxnvv\" (UniqueName: \"kubernetes.io/projected/07e1d41e-d3f2-4832-920d-38d43f4ef25f-kube-api-access-fxnvv\") pod \"07e1d41e-d3f2-4832-920d-38d43f4ef25f\" (UID: \"07e1d41e-d3f2-4832-920d-38d43f4ef25f\") " Mar 08 06:56:04 crc kubenswrapper[4717]: I0308 06:56:04.605364 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07e1d41e-d3f2-4832-920d-38d43f4ef25f-kube-api-access-fxnvv" (OuterVolumeSpecName: "kube-api-access-fxnvv") pod "07e1d41e-d3f2-4832-920d-38d43f4ef25f" (UID: "07e1d41e-d3f2-4832-920d-38d43f4ef25f"). InnerVolumeSpecName "kube-api-access-fxnvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:56:04 crc kubenswrapper[4717]: I0308 06:56:04.702805 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxnvv\" (UniqueName: \"kubernetes.io/projected/07e1d41e-d3f2-4832-920d-38d43f4ef25f-kube-api-access-fxnvv\") on node \"crc\" DevicePath \"\"" Mar 08 06:56:05 crc kubenswrapper[4717]: I0308 06:56:05.098643 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549216-xz7dn" Mar 08 06:56:05 crc kubenswrapper[4717]: I0308 06:56:05.098674 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549216-xz7dn" event={"ID":"07e1d41e-d3f2-4832-920d-38d43f4ef25f","Type":"ContainerDied","Data":"76345d99a144327dc3f5924977bcbfd69bd5c4ab5aa9567bcfff11ca00f518a5"} Mar 08 06:56:05 crc kubenswrapper[4717]: I0308 06:56:05.099102 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76345d99a144327dc3f5924977bcbfd69bd5c4ab5aa9567bcfff11ca00f518a5" Mar 08 06:56:05 crc kubenswrapper[4717]: I0308 06:56:05.100886 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/must-gather-gwp6z" event={"ID":"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4","Type":"ContainerStarted","Data":"fc5eb37cd74d8b11e5ed8a08bca43cfa274e3dc0c1e80a5fde5aecc293cda230"} Mar 08 06:56:05 crc kubenswrapper[4717]: I0308 06:56:05.554654 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549210-pgr86"] Mar 08 06:56:05 crc kubenswrapper[4717]: I0308 06:56:05.562154 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549210-pgr86"] Mar 08 06:56:05 crc kubenswrapper[4717]: I0308 06:56:05.797000 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957225ec-c79c-4068-ad77-76c63b57d468" path="/var/lib/kubelet/pods/957225ec-c79c-4068-ad77-76c63b57d468/volumes" Mar 08 06:56:10 crc kubenswrapper[4717]: I0308 06:56:10.205210 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/must-gather-gwp6z" event={"ID":"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4","Type":"ContainerStarted","Data":"aedada7002bc7cd903862db28b3be4af657d9492ab80d72f555cb1fef158dea9"} Mar 08 06:56:11 crc kubenswrapper[4717]: I0308 06:56:11.220205 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-4xsgn/must-gather-gwp6z" event={"ID":"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4","Type":"ContainerStarted","Data":"28f714a42332a54d7a953c2e34d6c3fb49fb21e4759cb15f75f51d0d71abd5a1"} Mar 08 06:56:11 crc kubenswrapper[4717]: I0308 06:56:11.248541 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4xsgn/must-gather-gwp6z" podStartSLOduration=3.450340324 podStartE2EDuration="9.248520245s" podCreationTimestamp="2026-03-08 06:56:02 +0000 UTC" firstStartedPulling="2026-03-08 06:56:04.089911797 +0000 UTC m=+5391.007560641" lastFinishedPulling="2026-03-08 06:56:09.888091718 +0000 UTC m=+5396.805740562" observedRunningTime="2026-03-08 06:56:11.243756668 +0000 UTC m=+5398.161405512" watchObservedRunningTime="2026-03-08 06:56:11.248520245 +0000 UTC m=+5398.166169099" Mar 08 06:56:14 crc kubenswrapper[4717]: I0308 06:56:14.405755 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4xsgn/crc-debug-2k9nn"] Mar 08 06:56:14 crc kubenswrapper[4717]: E0308 06:56:14.406673 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e1d41e-d3f2-4832-920d-38d43f4ef25f" containerName="oc" Mar 08 06:56:14 crc kubenswrapper[4717]: I0308 06:56:14.406701 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e1d41e-d3f2-4832-920d-38d43f4ef25f" containerName="oc" Mar 08 06:56:14 crc kubenswrapper[4717]: I0308 06:56:14.406896 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e1d41e-d3f2-4832-920d-38d43f4ef25f" containerName="oc" Mar 08 06:56:14 crc kubenswrapper[4717]: I0308 06:56:14.407467 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" Mar 08 06:56:14 crc kubenswrapper[4717]: I0308 06:56:14.531802 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e29b26b-97c8-40f3-81a3-969451555f55-host\") pod \"crc-debug-2k9nn\" (UID: \"2e29b26b-97c8-40f3-81a3-969451555f55\") " pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" Mar 08 06:56:14 crc kubenswrapper[4717]: I0308 06:56:14.532115 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxwk9\" (UniqueName: \"kubernetes.io/projected/2e29b26b-97c8-40f3-81a3-969451555f55-kube-api-access-rxwk9\") pod \"crc-debug-2k9nn\" (UID: \"2e29b26b-97c8-40f3-81a3-969451555f55\") " pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" Mar 08 06:56:14 crc kubenswrapper[4717]: I0308 06:56:14.634749 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e29b26b-97c8-40f3-81a3-969451555f55-host\") pod \"crc-debug-2k9nn\" (UID: \"2e29b26b-97c8-40f3-81a3-969451555f55\") " pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" Mar 08 06:56:14 crc kubenswrapper[4717]: I0308 06:56:14.634862 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxwk9\" (UniqueName: \"kubernetes.io/projected/2e29b26b-97c8-40f3-81a3-969451555f55-kube-api-access-rxwk9\") pod \"crc-debug-2k9nn\" (UID: \"2e29b26b-97c8-40f3-81a3-969451555f55\") " pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" Mar 08 06:56:14 crc kubenswrapper[4717]: I0308 06:56:14.634924 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e29b26b-97c8-40f3-81a3-969451555f55-host\") pod \"crc-debug-2k9nn\" (UID: \"2e29b26b-97c8-40f3-81a3-969451555f55\") " pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" Mar 08 06:56:15 crc 
kubenswrapper[4717]: I0308 06:56:15.179956 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxwk9\" (UniqueName: \"kubernetes.io/projected/2e29b26b-97c8-40f3-81a3-969451555f55-kube-api-access-rxwk9\") pod \"crc-debug-2k9nn\" (UID: \"2e29b26b-97c8-40f3-81a3-969451555f55\") " pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" Mar 08 06:56:15 crc kubenswrapper[4717]: I0308 06:56:15.324744 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" Mar 08 06:56:15 crc kubenswrapper[4717]: W0308 06:56:15.379594 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e29b26b_97c8_40f3_81a3_969451555f55.slice/crio-0658669ca7ad76cdeed8fee7fc4774ef09d1e76ea4f317dcac57d92ac18e1422 WatchSource:0}: Error finding container 0658669ca7ad76cdeed8fee7fc4774ef09d1e76ea4f317dcac57d92ac18e1422: Status 404 returned error can't find the container with id 0658669ca7ad76cdeed8fee7fc4774ef09d1e76ea4f317dcac57d92ac18e1422 Mar 08 06:56:16 crc kubenswrapper[4717]: I0308 06:56:16.264772 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" event={"ID":"2e29b26b-97c8-40f3-81a3-969451555f55","Type":"ContainerStarted","Data":"0658669ca7ad76cdeed8fee7fc4774ef09d1e76ea4f317dcac57d92ac18e1422"} Mar 08 06:56:27 crc kubenswrapper[4717]: I0308 06:56:27.373825 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" event={"ID":"2e29b26b-97c8-40f3-81a3-969451555f55","Type":"ContainerStarted","Data":"9dc558028b5ebadc8dc862e99575c31c6f66a8c077fd0b61173723f5236ebba3"} Mar 08 06:56:27 crc kubenswrapper[4717]: I0308 06:56:27.387492 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" podStartSLOduration=2.562713542 podStartE2EDuration="13.387452615s" 
podCreationTimestamp="2026-03-08 06:56:14 +0000 UTC" firstStartedPulling="2026-03-08 06:56:15.381654559 +0000 UTC m=+5402.299303403" lastFinishedPulling="2026-03-08 06:56:26.206393632 +0000 UTC m=+5413.124042476" observedRunningTime="2026-03-08 06:56:27.385116377 +0000 UTC m=+5414.302765231" watchObservedRunningTime="2026-03-08 06:56:27.387452615 +0000 UTC m=+5414.305101459" Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.665912 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rcchl"] Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.702270 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.704772 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rcchl"] Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.868355 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-utilities\") pod \"community-operators-rcchl\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.868459 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-catalog-content\") pod \"community-operators-rcchl\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.868529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4xpf\" (UniqueName: 
\"kubernetes.io/projected/5def8165-9123-4991-8ee4-e7dd949cda7d-kube-api-access-g4xpf\") pod \"community-operators-rcchl\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.969904 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-catalog-content\") pod \"community-operators-rcchl\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.970207 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4xpf\" (UniqueName: \"kubernetes.io/projected/5def8165-9123-4991-8ee4-e7dd949cda7d-kube-api-access-g4xpf\") pod \"community-operators-rcchl\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.970325 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-utilities\") pod \"community-operators-rcchl\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.970449 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-catalog-content\") pod \"community-operators-rcchl\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:35 crc kubenswrapper[4717]: I0308 06:56:35.970775 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-utilities\") pod \"community-operators-rcchl\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:36 crc kubenswrapper[4717]: I0308 06:56:36.002162 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4xpf\" (UniqueName: \"kubernetes.io/projected/5def8165-9123-4991-8ee4-e7dd949cda7d-kube-api-access-g4xpf\") pod \"community-operators-rcchl\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:36 crc kubenswrapper[4717]: I0308 06:56:36.028916 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:36 crc kubenswrapper[4717]: I0308 06:56:36.637029 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rcchl"] Mar 08 06:56:37 crc kubenswrapper[4717]: I0308 06:56:37.460109 4717 generic.go:334] "Generic (PLEG): container finished" podID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerID="b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6" exitCode=0 Mar 08 06:56:37 crc kubenswrapper[4717]: I0308 06:56:37.460529 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcchl" event={"ID":"5def8165-9123-4991-8ee4-e7dd949cda7d","Type":"ContainerDied","Data":"b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6"} Mar 08 06:56:37 crc kubenswrapper[4717]: I0308 06:56:37.460555 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcchl" event={"ID":"5def8165-9123-4991-8ee4-e7dd949cda7d","Type":"ContainerStarted","Data":"b0e0cbdb8b8023379805d35d73c27074e9fe94c44ec782366d057d9af4258737"} Mar 08 06:56:38 crc kubenswrapper[4717]: I0308 06:56:38.498367 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-rcchl" event={"ID":"5def8165-9123-4991-8ee4-e7dd949cda7d","Type":"ContainerStarted","Data":"04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de"} Mar 08 06:56:39 crc kubenswrapper[4717]: I0308 06:56:39.525255 4717 generic.go:334] "Generic (PLEG): container finished" podID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerID="04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de" exitCode=0 Mar 08 06:56:39 crc kubenswrapper[4717]: I0308 06:56:39.525336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcchl" event={"ID":"5def8165-9123-4991-8ee4-e7dd949cda7d","Type":"ContainerDied","Data":"04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de"} Mar 08 06:56:40 crc kubenswrapper[4717]: I0308 06:56:40.638277 4717 scope.go:117] "RemoveContainer" containerID="c5b4c2a9081f8c4abdf30cc565b88042590932bef05d4198d7e29250d9b40ccf" Mar 08 06:56:42 crc kubenswrapper[4717]: I0308 06:56:42.555537 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcchl" event={"ID":"5def8165-9123-4991-8ee4-e7dd949cda7d","Type":"ContainerStarted","Data":"d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159"} Mar 08 06:56:46 crc kubenswrapper[4717]: I0308 06:56:46.030257 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:46 crc kubenswrapper[4717]: I0308 06:56:46.030872 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:46 crc kubenswrapper[4717]: I0308 06:56:46.095878 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:46 crc kubenswrapper[4717]: I0308 06:56:46.114739 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-rcchl" podStartSLOduration=8.61212418 podStartE2EDuration="11.114721895s" podCreationTimestamp="2026-03-08 06:56:35 +0000 UTC" firstStartedPulling="2026-03-08 06:56:37.46295957 +0000 UTC m=+5424.380608414" lastFinishedPulling="2026-03-08 06:56:39.965557285 +0000 UTC m=+5426.883206129" observedRunningTime="2026-03-08 06:56:42.582188959 +0000 UTC m=+5429.499837823" watchObservedRunningTime="2026-03-08 06:56:46.114721895 +0000 UTC m=+5433.032370749" Mar 08 06:56:46 crc kubenswrapper[4717]: I0308 06:56:46.658314 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:46 crc kubenswrapper[4717]: I0308 06:56:46.710933 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rcchl"] Mar 08 06:56:48 crc kubenswrapper[4717]: I0308 06:56:48.614278 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rcchl" podUID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerName="registry-server" containerID="cri-o://d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159" gracePeriod=2 Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.102351 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.141481 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-catalog-content\") pod \"5def8165-9123-4991-8ee4-e7dd949cda7d\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.141611 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4xpf\" (UniqueName: \"kubernetes.io/projected/5def8165-9123-4991-8ee4-e7dd949cda7d-kube-api-access-g4xpf\") pod \"5def8165-9123-4991-8ee4-e7dd949cda7d\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.141673 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-utilities\") pod \"5def8165-9123-4991-8ee4-e7dd949cda7d\" (UID: \"5def8165-9123-4991-8ee4-e7dd949cda7d\") " Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.143169 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-utilities" (OuterVolumeSpecName: "utilities") pod "5def8165-9123-4991-8ee4-e7dd949cda7d" (UID: "5def8165-9123-4991-8ee4-e7dd949cda7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.174847 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5def8165-9123-4991-8ee4-e7dd949cda7d-kube-api-access-g4xpf" (OuterVolumeSpecName: "kube-api-access-g4xpf") pod "5def8165-9123-4991-8ee4-e7dd949cda7d" (UID: "5def8165-9123-4991-8ee4-e7dd949cda7d"). InnerVolumeSpecName "kube-api-access-g4xpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.224538 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5def8165-9123-4991-8ee4-e7dd949cda7d" (UID: "5def8165-9123-4991-8ee4-e7dd949cda7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.243877 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.243950 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4xpf\" (UniqueName: \"kubernetes.io/projected/5def8165-9123-4991-8ee4-e7dd949cda7d-kube-api-access-g4xpf\") on node \"crc\" DevicePath \"\"" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.243967 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5def8165-9123-4991-8ee4-e7dd949cda7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.623383 4717 generic.go:334] "Generic (PLEG): container finished" podID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerID="d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159" exitCode=0 Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.623493 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcchl" event={"ID":"5def8165-9123-4991-8ee4-e7dd949cda7d","Type":"ContainerDied","Data":"d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159"} Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.623753 4717 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-rcchl" event={"ID":"5def8165-9123-4991-8ee4-e7dd949cda7d","Type":"ContainerDied","Data":"b0e0cbdb8b8023379805d35d73c27074e9fe94c44ec782366d057d9af4258737"} Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.623778 4717 scope.go:117] "RemoveContainer" containerID="d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.623544 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rcchl" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.642376 4717 scope.go:117] "RemoveContainer" containerID="04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.668580 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rcchl"] Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.679384 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rcchl"] Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.684313 4717 scope.go:117] "RemoveContainer" containerID="b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.723361 4717 scope.go:117] "RemoveContainer" containerID="d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159" Mar 08 06:56:49 crc kubenswrapper[4717]: E0308 06:56:49.726061 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159\": container with ID starting with d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159 not found: ID does not exist" containerID="d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 
06:56:49.726104 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159"} err="failed to get container status \"d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159\": rpc error: code = NotFound desc = could not find container \"d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159\": container with ID starting with d248279e35655c397837393d8cd37af5cad7bd3d29d73ce91ff0b250e5f80159 not found: ID does not exist" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.726130 4717 scope.go:117] "RemoveContainer" containerID="04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de" Mar 08 06:56:49 crc kubenswrapper[4717]: E0308 06:56:49.727031 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de\": container with ID starting with 04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de not found: ID does not exist" containerID="04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.727059 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de"} err="failed to get container status \"04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de\": rpc error: code = NotFound desc = could not find container \"04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de\": container with ID starting with 04ae09071603c762ab7a520bb9be8d1a46ef710663b88e477809be3a852353de not found: ID does not exist" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.727075 4717 scope.go:117] "RemoveContainer" containerID="b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6" Mar 08 06:56:49 crc 
kubenswrapper[4717]: E0308 06:56:49.727483 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6\": container with ID starting with b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6 not found: ID does not exist" containerID="b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.727508 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6"} err="failed to get container status \"b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6\": rpc error: code = NotFound desc = could not find container \"b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6\": container with ID starting with b04dca59519cde4fb4c94a907a28b460c5806eee6a940fe460de225f878982d6 not found: ID does not exist" Mar 08 06:56:49 crc kubenswrapper[4717]: I0308 06:56:49.796553 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5def8165-9123-4991-8ee4-e7dd949cda7d" path="/var/lib/kubelet/pods/5def8165-9123-4991-8ee4-e7dd949cda7d/volumes" Mar 08 06:57:15 crc kubenswrapper[4717]: I0308 06:57:15.868866 4717 generic.go:334] "Generic (PLEG): container finished" podID="2e29b26b-97c8-40f3-81a3-969451555f55" containerID="9dc558028b5ebadc8dc862e99575c31c6f66a8c077fd0b61173723f5236ebba3" exitCode=0 Mar 08 06:57:15 crc kubenswrapper[4717]: I0308 06:57:15.868954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" event={"ID":"2e29b26b-97c8-40f3-81a3-969451555f55","Type":"ContainerDied","Data":"9dc558028b5ebadc8dc862e99575c31c6f66a8c077fd0b61173723f5236ebba3"} Mar 08 06:57:16 crc kubenswrapper[4717]: I0308 06:57:16.987567 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.008302 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e29b26b-97c8-40f3-81a3-969451555f55-host\") pod \"2e29b26b-97c8-40f3-81a3-969451555f55\" (UID: \"2e29b26b-97c8-40f3-81a3-969451555f55\") " Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.008462 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxwk9\" (UniqueName: \"kubernetes.io/projected/2e29b26b-97c8-40f3-81a3-969451555f55-kube-api-access-rxwk9\") pod \"2e29b26b-97c8-40f3-81a3-969451555f55\" (UID: \"2e29b26b-97c8-40f3-81a3-969451555f55\") " Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.008455 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e29b26b-97c8-40f3-81a3-969451555f55-host" (OuterVolumeSpecName: "host") pod "2e29b26b-97c8-40f3-81a3-969451555f55" (UID: "2e29b26b-97c8-40f3-81a3-969451555f55"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.008993 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e29b26b-97c8-40f3-81a3-969451555f55-host\") on node \"crc\" DevicePath \"\"" Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.024183 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e29b26b-97c8-40f3-81a3-969451555f55-kube-api-access-rxwk9" (OuterVolumeSpecName: "kube-api-access-rxwk9") pod "2e29b26b-97c8-40f3-81a3-969451555f55" (UID: "2e29b26b-97c8-40f3-81a3-969451555f55"). InnerVolumeSpecName "kube-api-access-rxwk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.039405 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4xsgn/crc-debug-2k9nn"] Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.046260 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4xsgn/crc-debug-2k9nn"] Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.111142 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxwk9\" (UniqueName: \"kubernetes.io/projected/2e29b26b-97c8-40f3-81a3-969451555f55-kube-api-access-rxwk9\") on node \"crc\" DevicePath \"\"" Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.792454 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e29b26b-97c8-40f3-81a3-969451555f55" path="/var/lib/kubelet/pods/2e29b26b-97c8-40f3-81a3-969451555f55/volumes" Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.889760 4717 scope.go:117] "RemoveContainer" containerID="9dc558028b5ebadc8dc862e99575c31c6f66a8c077fd0b61173723f5236ebba3" Mar 08 06:57:17 crc kubenswrapper[4717]: I0308 06:57:17.889802 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-2k9nn" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.350389 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4xsgn/crc-debug-q8xns"] Mar 08 06:57:18 crc kubenswrapper[4717]: E0308 06:57:18.350879 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerName="extract-utilities" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.350893 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerName="extract-utilities" Mar 08 06:57:18 crc kubenswrapper[4717]: E0308 06:57:18.350918 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerName="extract-content" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.350924 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerName="extract-content" Mar 08 06:57:18 crc kubenswrapper[4717]: E0308 06:57:18.350934 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerName="registry-server" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.350941 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerName="registry-server" Mar 08 06:57:18 crc kubenswrapper[4717]: E0308 06:57:18.350959 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e29b26b-97c8-40f3-81a3-969451555f55" containerName="container-00" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.350965 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e29b26b-97c8-40f3-81a3-969451555f55" containerName="container-00" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.351332 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e29b26b-97c8-40f3-81a3-969451555f55" 
containerName="container-00" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.351376 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5def8165-9123-4991-8ee4-e7dd949cda7d" containerName="registry-server" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.352279 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-q8xns" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.434054 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg55b\" (UniqueName: \"kubernetes.io/projected/5add2fb6-31fe-4318-a9a3-96464de4143c-kube-api-access-vg55b\") pod \"crc-debug-q8xns\" (UID: \"5add2fb6-31fe-4318-a9a3-96464de4143c\") " pod="openshift-must-gather-4xsgn/crc-debug-q8xns" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.434124 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5add2fb6-31fe-4318-a9a3-96464de4143c-host\") pod \"crc-debug-q8xns\" (UID: \"5add2fb6-31fe-4318-a9a3-96464de4143c\") " pod="openshift-must-gather-4xsgn/crc-debug-q8xns" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.536249 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg55b\" (UniqueName: \"kubernetes.io/projected/5add2fb6-31fe-4318-a9a3-96464de4143c-kube-api-access-vg55b\") pod \"crc-debug-q8xns\" (UID: \"5add2fb6-31fe-4318-a9a3-96464de4143c\") " pod="openshift-must-gather-4xsgn/crc-debug-q8xns" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.536372 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5add2fb6-31fe-4318-a9a3-96464de4143c-host\") pod \"crc-debug-q8xns\" (UID: \"5add2fb6-31fe-4318-a9a3-96464de4143c\") " pod="openshift-must-gather-4xsgn/crc-debug-q8xns" Mar 08 06:57:18 crc 
kubenswrapper[4717]: I0308 06:57:18.536558 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5add2fb6-31fe-4318-a9a3-96464de4143c-host\") pod \"crc-debug-q8xns\" (UID: \"5add2fb6-31fe-4318-a9a3-96464de4143c\") " pod="openshift-must-gather-4xsgn/crc-debug-q8xns" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.561797 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg55b\" (UniqueName: \"kubernetes.io/projected/5add2fb6-31fe-4318-a9a3-96464de4143c-kube-api-access-vg55b\") pod \"crc-debug-q8xns\" (UID: \"5add2fb6-31fe-4318-a9a3-96464de4143c\") " pod="openshift-must-gather-4xsgn/crc-debug-q8xns" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.670930 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-q8xns" Mar 08 06:57:18 crc kubenswrapper[4717]: I0308 06:57:18.899354 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/crc-debug-q8xns" event={"ID":"5add2fb6-31fe-4318-a9a3-96464de4143c","Type":"ContainerStarted","Data":"b5eb91b560022419fc14a66c96b36c2387ac1447b0b34c5520ba39f28454486b"} Mar 08 06:57:19 crc kubenswrapper[4717]: I0308 06:57:19.912777 4717 generic.go:334] "Generic (PLEG): container finished" podID="5add2fb6-31fe-4318-a9a3-96464de4143c" containerID="20decba40c27a2286c4e05415bc45caab45a1a637322d6d592996da59bf63708" exitCode=0 Mar 08 06:57:19 crc kubenswrapper[4717]: I0308 06:57:19.912887 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/crc-debug-q8xns" event={"ID":"5add2fb6-31fe-4318-a9a3-96464de4143c","Type":"ContainerDied","Data":"20decba40c27a2286c4e05415bc45caab45a1a637322d6d592996da59bf63708"} Mar 08 06:57:21 crc kubenswrapper[4717]: I0308 06:57:21.012794 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-q8xns" Mar 08 06:57:21 crc kubenswrapper[4717]: I0308 06:57:21.075819 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5add2fb6-31fe-4318-a9a3-96464de4143c-host\") pod \"5add2fb6-31fe-4318-a9a3-96464de4143c\" (UID: \"5add2fb6-31fe-4318-a9a3-96464de4143c\") " Mar 08 06:57:21 crc kubenswrapper[4717]: I0308 06:57:21.075965 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5add2fb6-31fe-4318-a9a3-96464de4143c-host" (OuterVolumeSpecName: "host") pod "5add2fb6-31fe-4318-a9a3-96464de4143c" (UID: "5add2fb6-31fe-4318-a9a3-96464de4143c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 06:57:21 crc kubenswrapper[4717]: I0308 06:57:21.076094 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg55b\" (UniqueName: \"kubernetes.io/projected/5add2fb6-31fe-4318-a9a3-96464de4143c-kube-api-access-vg55b\") pod \"5add2fb6-31fe-4318-a9a3-96464de4143c\" (UID: \"5add2fb6-31fe-4318-a9a3-96464de4143c\") " Mar 08 06:57:21 crc kubenswrapper[4717]: I0308 06:57:21.076661 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5add2fb6-31fe-4318-a9a3-96464de4143c-host\") on node \"crc\" DevicePath \"\"" Mar 08 06:57:21 crc kubenswrapper[4717]: I0308 06:57:21.081309 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5add2fb6-31fe-4318-a9a3-96464de4143c-kube-api-access-vg55b" (OuterVolumeSpecName: "kube-api-access-vg55b") pod "5add2fb6-31fe-4318-a9a3-96464de4143c" (UID: "5add2fb6-31fe-4318-a9a3-96464de4143c"). InnerVolumeSpecName "kube-api-access-vg55b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:57:21 crc kubenswrapper[4717]: I0308 06:57:21.177975 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg55b\" (UniqueName: \"kubernetes.io/projected/5add2fb6-31fe-4318-a9a3-96464de4143c-kube-api-access-vg55b\") on node \"crc\" DevicePath \"\"" Mar 08 06:57:21 crc kubenswrapper[4717]: I0308 06:57:21.930867 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/crc-debug-q8xns" event={"ID":"5add2fb6-31fe-4318-a9a3-96464de4143c","Type":"ContainerDied","Data":"b5eb91b560022419fc14a66c96b36c2387ac1447b0b34c5520ba39f28454486b"} Mar 08 06:57:21 crc kubenswrapper[4717]: I0308 06:57:21.930905 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-q8xns" Mar 08 06:57:21 crc kubenswrapper[4717]: I0308 06:57:21.930923 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5eb91b560022419fc14a66c96b36c2387ac1447b0b34c5520ba39f28454486b" Mar 08 06:57:22 crc kubenswrapper[4717]: I0308 06:57:22.107462 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4xsgn/crc-debug-q8xns"] Mar 08 06:57:22 crc kubenswrapper[4717]: I0308 06:57:22.126928 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4xsgn/crc-debug-q8xns"] Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.446354 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4xsgn/crc-debug-rp8qs"] Mar 08 06:57:23 crc kubenswrapper[4717]: E0308 06:57:23.447013 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5add2fb6-31fe-4318-a9a3-96464de4143c" containerName="container-00" Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.447026 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5add2fb6-31fe-4318-a9a3-96464de4143c" containerName="container-00" Mar 08 06:57:23 crc 
kubenswrapper[4717]: I0308 06:57:23.447222 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5add2fb6-31fe-4318-a9a3-96464de4143c" containerName="container-00" Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.447916 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.514298 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d606f11-614c-418c-b13a-a6768a17243b-host\") pod \"crc-debug-rp8qs\" (UID: \"2d606f11-614c-418c-b13a-a6768a17243b\") " pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.514404 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtft6\" (UniqueName: \"kubernetes.io/projected/2d606f11-614c-418c-b13a-a6768a17243b-kube-api-access-vtft6\") pod \"crc-debug-rp8qs\" (UID: \"2d606f11-614c-418c-b13a-a6768a17243b\") " pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.616826 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtft6\" (UniqueName: \"kubernetes.io/projected/2d606f11-614c-418c-b13a-a6768a17243b-kube-api-access-vtft6\") pod \"crc-debug-rp8qs\" (UID: \"2d606f11-614c-418c-b13a-a6768a17243b\") " pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.617074 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d606f11-614c-418c-b13a-a6768a17243b-host\") pod \"crc-debug-rp8qs\" (UID: \"2d606f11-614c-418c-b13a-a6768a17243b\") " pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.617279 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d606f11-614c-418c-b13a-a6768a17243b-host\") pod \"crc-debug-rp8qs\" (UID: \"2d606f11-614c-418c-b13a-a6768a17243b\") " pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.645505 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtft6\" (UniqueName: \"kubernetes.io/projected/2d606f11-614c-418c-b13a-a6768a17243b-kube-api-access-vtft6\") pod \"crc-debug-rp8qs\" (UID: \"2d606f11-614c-418c-b13a-a6768a17243b\") " pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.764674 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.802608 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5add2fb6-31fe-4318-a9a3-96464de4143c" path="/var/lib/kubelet/pods/5add2fb6-31fe-4318-a9a3-96464de4143c/volumes" Mar 08 06:57:23 crc kubenswrapper[4717]: W0308 06:57:23.820908 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d606f11_614c_418c_b13a_a6768a17243b.slice/crio-92537f199e3255ba6adae2279fdef956d485ed5e1e89f1969c1b3b2dad30a4e8 WatchSource:0}: Error finding container 92537f199e3255ba6adae2279fdef956d485ed5e1e89f1969c1b3b2dad30a4e8: Status 404 returned error can't find the container with id 92537f199e3255ba6adae2279fdef956d485ed5e1e89f1969c1b3b2dad30a4e8 Mar 08 06:57:23 crc kubenswrapper[4717]: I0308 06:57:23.964977 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" event={"ID":"2d606f11-614c-418c-b13a-a6768a17243b","Type":"ContainerStarted","Data":"92537f199e3255ba6adae2279fdef956d485ed5e1e89f1969c1b3b2dad30a4e8"} Mar 08 06:57:24 crc 
kubenswrapper[4717]: I0308 06:57:24.980620 4717 generic.go:334] "Generic (PLEG): container finished" podID="2d606f11-614c-418c-b13a-a6768a17243b" containerID="03df09ecb3bf4871c836b3234aa70122393aad80eb9963daf8ea91e0d8c55670" exitCode=0 Mar 08 06:57:24 crc kubenswrapper[4717]: I0308 06:57:24.980804 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" event={"ID":"2d606f11-614c-418c-b13a-a6768a17243b","Type":"ContainerDied","Data":"03df09ecb3bf4871c836b3234aa70122393aad80eb9963daf8ea91e0d8c55670"} Mar 08 06:57:25 crc kubenswrapper[4717]: I0308 06:57:25.045573 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4xsgn/crc-debug-rp8qs"] Mar 08 06:57:25 crc kubenswrapper[4717]: I0308 06:57:25.056457 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4xsgn/crc-debug-rp8qs"] Mar 08 06:57:26 crc kubenswrapper[4717]: I0308 06:57:26.122031 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" Mar 08 06:57:26 crc kubenswrapper[4717]: I0308 06:57:26.173078 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtft6\" (UniqueName: \"kubernetes.io/projected/2d606f11-614c-418c-b13a-a6768a17243b-kube-api-access-vtft6\") pod \"2d606f11-614c-418c-b13a-a6768a17243b\" (UID: \"2d606f11-614c-418c-b13a-a6768a17243b\") " Mar 08 06:57:26 crc kubenswrapper[4717]: I0308 06:57:26.173485 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d606f11-614c-418c-b13a-a6768a17243b-host\") pod \"2d606f11-614c-418c-b13a-a6768a17243b\" (UID: \"2d606f11-614c-418c-b13a-a6768a17243b\") " Mar 08 06:57:26 crc kubenswrapper[4717]: I0308 06:57:26.174417 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d606f11-614c-418c-b13a-a6768a17243b-host" (OuterVolumeSpecName: "host") pod "2d606f11-614c-418c-b13a-a6768a17243b" (UID: "2d606f11-614c-418c-b13a-a6768a17243b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 06:57:26 crc kubenswrapper[4717]: I0308 06:57:26.183170 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d606f11-614c-418c-b13a-a6768a17243b-kube-api-access-vtft6" (OuterVolumeSpecName: "kube-api-access-vtft6") pod "2d606f11-614c-418c-b13a-a6768a17243b" (UID: "2d606f11-614c-418c-b13a-a6768a17243b"). InnerVolumeSpecName "kube-api-access-vtft6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:57:26 crc kubenswrapper[4717]: I0308 06:57:26.276487 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtft6\" (UniqueName: \"kubernetes.io/projected/2d606f11-614c-418c-b13a-a6768a17243b-kube-api-access-vtft6\") on node \"crc\" DevicePath \"\"" Mar 08 06:57:26 crc kubenswrapper[4717]: I0308 06:57:26.276533 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d606f11-614c-418c-b13a-a6768a17243b-host\") on node \"crc\" DevicePath \"\"" Mar 08 06:57:27 crc kubenswrapper[4717]: I0308 06:57:27.003341 4717 scope.go:117] "RemoveContainer" containerID="03df09ecb3bf4871c836b3234aa70122393aad80eb9963daf8ea91e0d8c55670" Mar 08 06:57:27 crc kubenswrapper[4717]: I0308 06:57:27.003402 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsgn/crc-debug-rp8qs" Mar 08 06:57:27 crc kubenswrapper[4717]: I0308 06:57:27.798103 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d606f11-614c-418c-b13a-a6768a17243b" path="/var/lib/kubelet/pods/2d606f11-614c-418c-b13a-a6768a17243b/volumes" Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.188537 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549218-ws8jp"] Mar 08 06:58:00 crc kubenswrapper[4717]: E0308 06:58:00.189971 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d606f11-614c-418c-b13a-a6768a17243b" containerName="container-00" Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.189989 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d606f11-614c-418c-b13a-a6768a17243b" containerName="container-00" Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.190666 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d606f11-614c-418c-b13a-a6768a17243b" containerName="container-00" Mar 08 06:58:00 crc 
kubenswrapper[4717]: I0308 06:58:00.193662 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549218-ws8jp" Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.197877 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.197932 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.198640 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.207353 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549218-ws8jp"] Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.302072 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ggt9\" (UniqueName: \"kubernetes.io/projected/23243fde-3416-4769-9537-e21316883021-kube-api-access-6ggt9\") pod \"auto-csr-approver-29549218-ws8jp\" (UID: \"23243fde-3416-4769-9537-e21316883021\") " pod="openshift-infra/auto-csr-approver-29549218-ws8jp" Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.404414 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ggt9\" (UniqueName: \"kubernetes.io/projected/23243fde-3416-4769-9537-e21316883021-kube-api-access-6ggt9\") pod \"auto-csr-approver-29549218-ws8jp\" (UID: \"23243fde-3416-4769-9537-e21316883021\") " pod="openshift-infra/auto-csr-approver-29549218-ws8jp" Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.425722 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ggt9\" (UniqueName: \"kubernetes.io/projected/23243fde-3416-4769-9537-e21316883021-kube-api-access-6ggt9\") pod 
\"auto-csr-approver-29549218-ws8jp\" (UID: \"23243fde-3416-4769-9537-e21316883021\") " pod="openshift-infra/auto-csr-approver-29549218-ws8jp" Mar 08 06:58:00 crc kubenswrapper[4717]: I0308 06:58:00.528404 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549218-ws8jp" Mar 08 06:58:01 crc kubenswrapper[4717]: I0308 06:58:01.032037 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549218-ws8jp"] Mar 08 06:58:01 crc kubenswrapper[4717]: I0308 06:58:01.039956 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 06:58:01 crc kubenswrapper[4717]: I0308 06:58:01.341914 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549218-ws8jp" event={"ID":"23243fde-3416-4769-9537-e21316883021","Type":"ContainerStarted","Data":"d5be247b8386f0a26c8ed03a99ca406681526ac3e59beba95a6eb57e69cd5840"} Mar 08 06:58:02 crc kubenswrapper[4717]: I0308 06:58:02.357275 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549218-ws8jp" event={"ID":"23243fde-3416-4769-9537-e21316883021","Type":"ContainerStarted","Data":"8c8384607ba0d268d4e3239b969696407618d30a0c1e34878c835a94f0a587ad"} Mar 08 06:58:02 crc kubenswrapper[4717]: I0308 06:58:02.385325 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549218-ws8jp" podStartSLOduration=1.5049753890000002 podStartE2EDuration="2.385305847s" podCreationTimestamp="2026-03-08 06:58:00 +0000 UTC" firstStartedPulling="2026-03-08 06:58:01.039716094 +0000 UTC m=+5507.957364938" lastFinishedPulling="2026-03-08 06:58:01.920046552 +0000 UTC m=+5508.837695396" observedRunningTime="2026-03-08 06:58:02.374520112 +0000 UTC m=+5509.292168956" watchObservedRunningTime="2026-03-08 06:58:02.385305847 +0000 UTC m=+5509.302954691" Mar 08 06:58:03 crc 
kubenswrapper[4717]: I0308 06:58:03.370314 4717 generic.go:334] "Generic (PLEG): container finished" podID="23243fde-3416-4769-9537-e21316883021" containerID="8c8384607ba0d268d4e3239b969696407618d30a0c1e34878c835a94f0a587ad" exitCode=0 Mar 08 06:58:03 crc kubenswrapper[4717]: I0308 06:58:03.370372 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549218-ws8jp" event={"ID":"23243fde-3416-4769-9537-e21316883021","Type":"ContainerDied","Data":"8c8384607ba0d268d4e3239b969696407618d30a0c1e34878c835a94f0a587ad"} Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.111328 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-569c95cff8-l9lj5_24f496e4-7d01-447c-9ebb-9da7b333d817/barbican-api/0.log" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.120301 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.120619 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.132042 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-569c95cff8-l9lj5_24f496e4-7d01-447c-9ebb-9da7b333d817/barbican-api-log/0.log" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.360241 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-79d58bf7f8-bq7ms_47805c28-c90d-4882-a0ed-5e531fb545b4/barbican-keystone-listener/0.log" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.369803 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79d58bf7f8-bq7ms_47805c28-c90d-4882-a0ed-5e531fb545b4/barbican-keystone-listener-log/0.log" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.455417 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f775c4c7-brpx4_d7bf9dbc-ec82-4659-92fe-509f95574ef3/barbican-worker/0.log" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.701471 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f775c4c7-brpx4_d7bf9dbc-ec82-4659-92fe-509f95574ef3/barbican-worker-log/0.log" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.752889 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549218-ws8jp" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.769676 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz_d8682143-56c7-442e-987a-d9da77fbe879/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.896371 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ggt9\" (UniqueName: \"kubernetes.io/projected/23243fde-3416-4769-9537-e21316883021-kube-api-access-6ggt9\") pod \"23243fde-3416-4769-9537-e21316883021\" (UID: \"23243fde-3416-4769-9537-e21316883021\") " Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.902568 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23243fde-3416-4769-9537-e21316883021-kube-api-access-6ggt9" (OuterVolumeSpecName: "kube-api-access-6ggt9") pod 
"23243fde-3416-4769-9537-e21316883021" (UID: "23243fde-3416-4769-9537-e21316883021"). InnerVolumeSpecName "kube-api-access-6ggt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.975489 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_34d60e99-8898-4576-b35a-8323db25511c/ceilometer-notification-agent/0.log" Mar 08 06:58:04 crc kubenswrapper[4717]: I0308 06:58:04.998493 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ggt9\" (UniqueName: \"kubernetes.io/projected/23243fde-3416-4769-9537-e21316883021-kube-api-access-6ggt9\") on node \"crc\" DevicePath \"\"" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.038784 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_34d60e99-8898-4576-b35a-8323db25511c/proxy-httpd/0.log" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.045626 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_34d60e99-8898-4576-b35a-8323db25511c/ceilometer-central-agent/0.log" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.048367 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_34d60e99-8898-4576-b35a-8323db25511c/sg-core/0.log" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.211456 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5e36763c-a3c1-424c-8982-1af635ee7100/cinder-api-log/0.log" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.400126 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549218-ws8jp" event={"ID":"23243fde-3416-4769-9537-e21316883021","Type":"ContainerDied","Data":"d5be247b8386f0a26c8ed03a99ca406681526ac3e59beba95a6eb57e69cd5840"} Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.400169 4717 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d5be247b8386f0a26c8ed03a99ca406681526ac3e59beba95a6eb57e69cd5840" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.400226 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549218-ws8jp" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.430063 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5e36763c-a3c1-424c-8982-1af635ee7100/cinder-api/0.log" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.446733 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549212-9jsnh"] Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.461819 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549212-9jsnh"] Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.508234 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8fc8ad72-3a80-4520-8387-11aeb8bca94f/cinder-scheduler/0.log" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.520294 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8fc8ad72-3a80-4520-8387-11aeb8bca94f/probe/0.log" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.676631 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm_fbffff61-9614-4594-b52e-be489d2b2f22/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.750259 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9_d9973d9b-7167-4a1b-9115-a7fb9a2921c3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.793199 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c04f5428-7ebe-4ec3-b9aa-07c6222ff270" path="/var/lib/kubelet/pods/c04f5428-7ebe-4ec3-b9aa-07c6222ff270/volumes" Mar 08 06:58:05 crc kubenswrapper[4717]: I0308 06:58:05.875359 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-9f5648895-t45xw_b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9/init/0.log" Mar 08 06:58:06 crc kubenswrapper[4717]: I0308 06:58:06.076100 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-9f5648895-t45xw_b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9/init/0.log" Mar 08 06:58:06 crc kubenswrapper[4717]: I0308 06:58:06.196868 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c_099dec1f-b123-4da0-a81f-52ee1b27d5df/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:06 crc kubenswrapper[4717]: I0308 06:58:06.479764 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-9f5648895-t45xw_b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9/dnsmasq-dns/0.log" Mar 08 06:58:06 crc kubenswrapper[4717]: I0308 06:58:06.515858 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2ffc0380-502c-48b0-b36a-8421c5503fde/glance-httpd/0.log" Mar 08 06:58:06 crc kubenswrapper[4717]: I0308 06:58:06.627288 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2ffc0380-502c-48b0-b36a-8421c5503fde/glance-log/0.log" Mar 08 06:58:06 crc kubenswrapper[4717]: I0308 06:58:06.793364 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c715384f-21a1-490a-9432-1fef4658f5bd/glance-log/0.log" Mar 08 06:58:06 crc kubenswrapper[4717]: I0308 06:58:06.797012 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c715384f-21a1-490a-9432-1fef4658f5bd/glance-httpd/0.log" Mar 08 06:58:07 crc kubenswrapper[4717]: I0308 
06:58:07.106003 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-69bcb664dd-nb94m_9ab815c4-1b4d-499a-af69-f5e5907c9542/horizon/0.log" Mar 08 06:58:07 crc kubenswrapper[4717]: I0308 06:58:07.189235 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw_534637bd-8579-46f3-bee7-d6270aa8130c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:07 crc kubenswrapper[4717]: I0308 06:58:07.439288 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5vnmd_ef634a99-41c2-496a-b06e-d697710e676d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:07 crc kubenswrapper[4717]: I0308 06:58:07.652341 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-69bcb664dd-nb94m_9ab815c4-1b4d-499a-af69-f5e5907c9542/horizon-log/0.log" Mar 08 06:58:07 crc kubenswrapper[4717]: I0308 06:58:07.742232 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29549161-72gqq_8a320a32-4b25-423c-9e3c-5ca2d08652c5/keystone-cron/0.log" Mar 08 06:58:07 crc kubenswrapper[4717]: I0308 06:58:07.979975 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_99fa4aa1-7000-4df4-8c35-d9bf87df65f3/kube-state-metrics/0.log" Mar 08 06:58:07 crc kubenswrapper[4717]: I0308 06:58:07.981763 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5799fc9f64-fmph6_4270902e-1721-4286-be1f-baadb9dc68c1/keystone-api/0.log" Mar 08 06:58:07 crc kubenswrapper[4717]: I0308 06:58:07.983971 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm_a99f6dd1-e80a-4191-b85a-31042a1d9fc0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:08 crc kubenswrapper[4717]: I0308 06:58:08.483308 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8_ced2f113-4928-44e4-a34a-3ff2a669dec6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:08 crc kubenswrapper[4717]: I0308 06:58:08.500320 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-665f758875-jsp86_7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494/neutron-httpd/0.log" Mar 08 06:58:08 crc kubenswrapper[4717]: I0308 06:58:08.590827 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-665f758875-jsp86_7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494/neutron-api/0.log" Mar 08 06:58:08 crc kubenswrapper[4717]: I0308 06:58:08.713401 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec/setup-container/0.log" Mar 08 06:58:08 crc kubenswrapper[4717]: I0308 06:58:08.933012 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec/rabbitmq/0.log" Mar 08 06:58:08 crc kubenswrapper[4717]: I0308 06:58:08.947924 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec/setup-container/0.log" Mar 08 06:58:09 crc kubenswrapper[4717]: I0308 06:58:09.551569 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_398b8cef-0b8b-4e8f-80c2-2afa74fa75be/nova-cell0-conductor-conductor/0.log" Mar 08 06:58:09 crc kubenswrapper[4717]: I0308 06:58:09.931636 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_65a7fb6f-308f-468e-8e2c-7adc53b2eb15/nova-cell1-conductor-conductor/0.log" Mar 08 06:58:10 crc kubenswrapper[4717]: I0308 06:58:10.260321 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bf251d2f-577d-4de2-ac4b-f51dc79add8d/nova-cell1-novncproxy-novncproxy/0.log" Mar 08 06:58:10 crc kubenswrapper[4717]: I0308 06:58:10.338117 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2a5123f2-ce36-401f-90d0-885684623a99/nova-api-log/0.log" Mar 08 06:58:10 crc kubenswrapper[4717]: I0308 06:58:10.474192 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-rct9m_adf01f26-1066-4901-aa10-cd145a720cd6/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:10 crc kubenswrapper[4717]: I0308 06:58:10.550167 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2a5123f2-ce36-401f-90d0-885684623a99/nova-api-api/0.log" Mar 08 06:58:10 crc kubenswrapper[4717]: I0308 06:58:10.643378 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5315a88e-0b28-4ad7-bc83-278711f6fb29/nova-metadata-log/0.log" Mar 08 06:58:11 crc kubenswrapper[4717]: I0308 06:58:11.052488 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e4e6ff9-db68-44fc-a8d2-de9471a74f19/mysql-bootstrap/0.log" Mar 08 06:58:11 crc kubenswrapper[4717]: I0308 06:58:11.167088 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_fea1dcce-25a3-4f13-960a-5b08bf49e521/nova-scheduler-scheduler/0.log" Mar 08 06:58:11 crc kubenswrapper[4717]: I0308 06:58:11.276647 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e4e6ff9-db68-44fc-a8d2-de9471a74f19/galera/0.log" Mar 08 06:58:11 crc kubenswrapper[4717]: I0308 06:58:11.278877 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e4e6ff9-db68-44fc-a8d2-de9471a74f19/mysql-bootstrap/0.log" Mar 08 06:58:12 crc kubenswrapper[4717]: I0308 06:58:12.095728 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-galera-0_739f45be-d031-4f80-9c39-1683ddff1289/mysql-bootstrap/0.log" Mar 08 06:58:12 crc kubenswrapper[4717]: I0308 06:58:12.296659 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_739f45be-d031-4f80-9c39-1683ddff1289/mysql-bootstrap/0.log" Mar 08 06:58:12 crc kubenswrapper[4717]: I0308 06:58:12.344314 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_739f45be-d031-4f80-9c39-1683ddff1289/galera/0.log" Mar 08 06:58:12 crc kubenswrapper[4717]: I0308 06:58:12.475025 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9a66e3e0-63d9-4ca4-ab60-8a842f37cc68/openstackclient/0.log" Mar 08 06:58:12 crc kubenswrapper[4717]: I0308 06:58:12.593756 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mcfsn_52c7f6df-8563-4181-bc1e-6fb4c3dd2126/ovn-controller/0.log" Mar 08 06:58:12 crc kubenswrapper[4717]: I0308 06:58:12.729353 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jfvhf_047dda74-4541-43e2-bc0f-ebdd951d1dbf/openstack-network-exporter/0.log" Mar 08 06:58:12 crc kubenswrapper[4717]: I0308 06:58:12.767224 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5315a88e-0b28-4ad7-bc83-278711f6fb29/nova-metadata-metadata/0.log" Mar 08 06:58:13 crc kubenswrapper[4717]: I0308 06:58:13.154167 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4c5fb_0bed90a3-1840-4f1a-a71b-cad45398bd15/ovsdb-server-init/0.log" Mar 08 06:58:13 crc kubenswrapper[4717]: I0308 06:58:13.874631 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4c5fb_0bed90a3-1840-4f1a-a71b-cad45398bd15/ovsdb-server-init/0.log" Mar 08 06:58:14 crc kubenswrapper[4717]: I0308 06:58:14.103101 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-4c5fb_0bed90a3-1840-4f1a-a71b-cad45398bd15/ovsdb-server/0.log" Mar 08 06:58:14 crc kubenswrapper[4717]: I0308 06:58:14.215985 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-2c94z_fed1792b-78eb-43bf-9e33-276a5b4477f7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:14 crc kubenswrapper[4717]: I0308 06:58:14.309998 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bde07dc5-6141-42e2-b280-d4df5ebe3d61/openstack-network-exporter/0.log" Mar 08 06:58:14 crc kubenswrapper[4717]: I0308 06:58:14.430645 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bde07dc5-6141-42e2-b280-d4df5ebe3d61/ovn-northd/0.log" Mar 08 06:58:14 crc kubenswrapper[4717]: I0308 06:58:14.433495 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4c5fb_0bed90a3-1840-4f1a-a71b-cad45398bd15/ovs-vswitchd/0.log" Mar 08 06:58:14 crc kubenswrapper[4717]: I0308 06:58:14.581645 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_353c8ab9-3710-4290-b5c4-b93339baf4da/openstack-network-exporter/0.log" Mar 08 06:58:14 crc kubenswrapper[4717]: I0308 06:58:14.680423 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_353c8ab9-3710-4290-b5c4-b93339baf4da/ovsdbserver-nb/0.log" Mar 08 06:58:14 crc kubenswrapper[4717]: I0308 06:58:14.818876 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0427b7fd-2766-4b7f-bb23-96df1b2f4f5c/openstack-network-exporter/0.log" Mar 08 06:58:14 crc kubenswrapper[4717]: I0308 06:58:14.890654 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0427b7fd-2766-4b7f-bb23-96df1b2f4f5c/ovsdbserver-sb/0.log" Mar 08 06:58:15 crc kubenswrapper[4717]: I0308 06:58:15.115067 4717 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-6c49bc6878-t8tg8_bb4d403c-6eb6-401c-9b4b-734c6adf3828/placement-api/0.log" Mar 08 06:58:15 crc kubenswrapper[4717]: I0308 06:58:15.181737 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f60230bf-f6a0-4a30-8d32-fd3ec01cf27a/init-config-reloader/0.log" Mar 08 06:58:15 crc kubenswrapper[4717]: I0308 06:58:15.226450 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c49bc6878-t8tg8_bb4d403c-6eb6-401c-9b4b-734c6adf3828/placement-log/0.log" Mar 08 06:58:15 crc kubenswrapper[4717]: I0308 06:58:15.395922 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f60230bf-f6a0-4a30-8d32-fd3ec01cf27a/init-config-reloader/0.log" Mar 08 06:58:15 crc kubenswrapper[4717]: I0308 06:58:15.402037 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f60230bf-f6a0-4a30-8d32-fd3ec01cf27a/config-reloader/0.log" Mar 08 06:58:15 crc kubenswrapper[4717]: I0308 06:58:15.457269 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f60230bf-f6a0-4a30-8d32-fd3ec01cf27a/thanos-sidecar/0.log" Mar 08 06:58:15 crc kubenswrapper[4717]: I0308 06:58:15.472535 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f60230bf-f6a0-4a30-8d32-fd3ec01cf27a/prometheus/0.log" Mar 08 06:58:15 crc kubenswrapper[4717]: I0308 06:58:15.679467 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7/setup-container/0.log" Mar 08 06:58:15 crc kubenswrapper[4717]: I0308 06:58:15.906744 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7/setup-container/0.log" Mar 08 06:58:16 crc kubenswrapper[4717]: I0308 06:58:16.013965 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c1331e99-d131-4f8c-ae4e-6217cf54ddaf/setup-container/0.log" Mar 08 06:58:16 crc kubenswrapper[4717]: I0308 06:58:16.025067 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7/rabbitmq/0.log" Mar 08 06:58:16 crc kubenswrapper[4717]: I0308 06:58:16.105605 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c1331e99-d131-4f8c-ae4e-6217cf54ddaf/setup-container/0.log" Mar 08 06:58:16 crc kubenswrapper[4717]: I0308 06:58:16.177217 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c1331e99-d131-4f8c-ae4e-6217cf54ddaf/rabbitmq/0.log" Mar 08 06:58:16 crc kubenswrapper[4717]: I0308 06:58:16.278471 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp_228cb615-a265-435c-bca0-5cb037e311b6/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:16 crc kubenswrapper[4717]: I0308 06:58:16.382720 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-kqcr2_9af81be1-1bd6-46d1-ab21-d61cd769fd21/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:16 crc kubenswrapper[4717]: I0308 06:58:16.778060 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h_9cff8432-fc11-45e4-9e59-9abfdc356b44/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:16 crc kubenswrapper[4717]: I0308 06:58:16.847940 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-hdxjp_e60e7e38-c8ac-4b48-bfc0-04e5b8e56874/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:16 crc kubenswrapper[4717]: I0308 06:58:16.950064 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dz2kg_d4a65517-456b-4bf4-9e4b-cea94baeb6a7/ssh-known-hosts-edpm-deployment/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.197050 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-cc47695ff-btlzb_050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6/proxy-server/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.306711 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-cc47695ff-btlzb_050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6/proxy-httpd/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.309181 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-wc6z2_03d2941c-7434-4961-a7ea-fdff878a1128/swift-ring-rebalance/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.384900 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/account-auditor/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.531116 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/account-reaper/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.569890 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/account-replicator/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.588183 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/account-server/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.684056 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/container-auditor/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.756313 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/container-updater/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.778347 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/container-server/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.834878 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/container-replicator/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.957824 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/object-auditor/0.log" Mar 08 06:58:17 crc kubenswrapper[4717]: I0308 06:58:17.970150 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/object-expirer/0.log" Mar 08 06:58:18 crc kubenswrapper[4717]: I0308 06:58:18.003457 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/object-replicator/0.log" Mar 08 06:58:18 crc kubenswrapper[4717]: I0308 06:58:18.045152 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/object-server/0.log" Mar 08 06:58:18 crc kubenswrapper[4717]: I0308 06:58:18.172611 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/object-updater/0.log" Mar 08 06:58:18 crc kubenswrapper[4717]: I0308 06:58:18.191278 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/swift-recon-cron/0.log" Mar 08 06:58:18 crc kubenswrapper[4717]: I0308 06:58:18.247920 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/rsync/0.log" Mar 08 06:58:18 crc kubenswrapper[4717]: I0308 06:58:18.451558 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj_2dabb1b7-df9b-4b70-94dc-d9e29be0856f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:18 crc kubenswrapper[4717]: I0308 06:58:18.539955 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0e0647cd-807a-44fc-a1e0-f5ce609b835d/tempest-tests-tempest-tests-runner/0.log" Mar 08 06:58:18 crc kubenswrapper[4717]: I0308 06:58:18.669578 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d536dc5d-11cf-4b1a-81bc-a17b532c9baa/test-operator-logs-container/0.log" Mar 08 06:58:18 crc kubenswrapper[4717]: I0308 06:58:18.794569 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kc72w_f1249858-ba3a-4c6e-af8a-b7784e9795a0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 06:58:19 crc kubenswrapper[4717]: I0308 06:58:19.526745 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_58162927-f626-43d8-a792-507cf584db78/watcher-applier/0.log" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.061829 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wrjx9"] Mar 08 06:58:20 crc kubenswrapper[4717]: E0308 06:58:20.062537 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23243fde-3416-4769-9537-e21316883021" containerName="oc" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.062555 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="23243fde-3416-4769-9537-e21316883021" containerName="oc" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.062955 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="23243fde-3416-4769-9537-e21316883021" containerName="oc" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.065437 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.083029 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wrjx9"] Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.135356 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_acde9c29-0910-40cc-9da8-06a566c67b4c/watcher-api-log/0.log" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.188433 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-catalog-content\") pod \"certified-operators-wrjx9\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.188508 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-746nd\" (UniqueName: \"kubernetes.io/projected/136af4de-2abf-4294-8343-eeafd1cfc5a2-kube-api-access-746nd\") pod \"certified-operators-wrjx9\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.188605 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-utilities\") pod \"certified-operators-wrjx9\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 
06:58:20.291715 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-utilities\") pod \"certified-operators-wrjx9\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.291803 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-catalog-content\") pod \"certified-operators-wrjx9\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.292567 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-utilities\") pod \"certified-operators-wrjx9\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.292605 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-catalog-content\") pod \"certified-operators-wrjx9\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.293106 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-746nd\" (UniqueName: \"kubernetes.io/projected/136af4de-2abf-4294-8343-eeafd1cfc5a2-kube-api-access-746nd\") pod \"certified-operators-wrjx9\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:20 crc kubenswrapper[4717]: I0308 06:58:20.718743 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-746nd\" (UniqueName: \"kubernetes.io/projected/136af4de-2abf-4294-8343-eeafd1cfc5a2-kube-api-access-746nd\") pod \"certified-operators-wrjx9\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:21 crc kubenswrapper[4717]: I0308 06:58:21.004117 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:21 crc kubenswrapper[4717]: I0308 06:58:21.513408 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wrjx9"] Mar 08 06:58:21 crc kubenswrapper[4717]: I0308 06:58:21.573231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrjx9" event={"ID":"136af4de-2abf-4294-8343-eeafd1cfc5a2","Type":"ContainerStarted","Data":"328737832f3e6d1a48d3c1eb6eaa4b15cc391256c0291414b2d155a2680f3d35"} Mar 08 06:58:22 crc kubenswrapper[4717]: I0308 06:58:22.582526 4717 generic.go:334] "Generic (PLEG): container finished" podID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerID="b5dc8b0975d474fb5f54188e43e3685301bd4ed7a76c3895d6adc7b846ec9ed1" exitCode=0 Mar 08 06:58:22 crc kubenswrapper[4717]: I0308 06:58:22.582788 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrjx9" event={"ID":"136af4de-2abf-4294-8343-eeafd1cfc5a2","Type":"ContainerDied","Data":"b5dc8b0975d474fb5f54188e43e3685301bd4ed7a76c3895d6adc7b846ec9ed1"} Mar 08 06:58:22 crc kubenswrapper[4717]: I0308 06:58:22.782768 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_e1b3240f-d4a8-409e-a8bf-a2f2d03ac126/watcher-decision-engine/0.log" Mar 08 06:58:24 crc kubenswrapper[4717]: I0308 06:58:24.048618 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-api-0_acde9c29-0910-40cc-9da8-06a566c67b4c/watcher-api/0.log" Mar 08 06:58:24 crc kubenswrapper[4717]: I0308 06:58:24.601372 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrjx9" event={"ID":"136af4de-2abf-4294-8343-eeafd1cfc5a2","Type":"ContainerStarted","Data":"6105b73bf5332deb8ec7d982d5decadcf8af54ed5f7fc8c6eca4bde2a4c310bc"} Mar 08 06:58:27 crc kubenswrapper[4717]: I0308 06:58:27.651218 4717 generic.go:334] "Generic (PLEG): container finished" podID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerID="6105b73bf5332deb8ec7d982d5decadcf8af54ed5f7fc8c6eca4bde2a4c310bc" exitCode=0 Mar 08 06:58:27 crc kubenswrapper[4717]: I0308 06:58:27.651640 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrjx9" event={"ID":"136af4de-2abf-4294-8343-eeafd1cfc5a2","Type":"ContainerDied","Data":"6105b73bf5332deb8ec7d982d5decadcf8af54ed5f7fc8c6eca4bde2a4c310bc"} Mar 08 06:58:28 crc kubenswrapper[4717]: I0308 06:58:28.662086 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrjx9" event={"ID":"136af4de-2abf-4294-8343-eeafd1cfc5a2","Type":"ContainerStarted","Data":"7383c33a941bc11de172da9a0b4cbc003b2559627715da6b52b4d1331cad7020"} Mar 08 06:58:28 crc kubenswrapper[4717]: I0308 06:58:28.686768 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wrjx9" podStartSLOduration=3.169271171 podStartE2EDuration="8.686748779s" podCreationTimestamp="2026-03-08 06:58:20 +0000 UTC" firstStartedPulling="2026-03-08 06:58:22.584658155 +0000 UTC m=+5529.502306999" lastFinishedPulling="2026-03-08 06:58:28.102135763 +0000 UTC m=+5535.019784607" observedRunningTime="2026-03-08 06:58:28.686465562 +0000 UTC m=+5535.604114416" watchObservedRunningTime="2026-03-08 06:58:28.686748779 +0000 UTC m=+5535.604397623" Mar 08 06:58:31 crc 
kubenswrapper[4717]: I0308 06:58:31.004826 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:31 crc kubenswrapper[4717]: I0308 06:58:31.006026 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:31 crc kubenswrapper[4717]: I0308 06:58:31.056455 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:32 crc kubenswrapper[4717]: I0308 06:58:32.760811 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ee8a4411-d973-4eeb-b6cd-eb0844e7826e/memcached/0.log" Mar 08 06:58:34 crc kubenswrapper[4717]: I0308 06:58:34.120491 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:58:34 crc kubenswrapper[4717]: I0308 06:58:34.120553 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:58:41 crc kubenswrapper[4717]: I0308 06:58:41.069183 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:41 crc kubenswrapper[4717]: I0308 06:58:41.145873 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wrjx9"] Mar 08 06:58:41 crc kubenswrapper[4717]: I0308 06:58:41.772832 4717 scope.go:117] "RemoveContainer" 
containerID="975d5dcf0cfbc7dbc35c3ca4c7d2b0d4e8bd14cfaf3dee0ec6acf3cbed0230ba" Mar 08 06:58:41 crc kubenswrapper[4717]: I0308 06:58:41.787338 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wrjx9" podUID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerName="registry-server" containerID="cri-o://7383c33a941bc11de172da9a0b4cbc003b2559627715da6b52b4d1331cad7020" gracePeriod=2 Mar 08 06:58:42 crc kubenswrapper[4717]: I0308 06:58:42.805226 4717 generic.go:334] "Generic (PLEG): container finished" podID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerID="7383c33a941bc11de172da9a0b4cbc003b2559627715da6b52b4d1331cad7020" exitCode=0 Mar 08 06:58:42 crc kubenswrapper[4717]: I0308 06:58:42.805270 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrjx9" event={"ID":"136af4de-2abf-4294-8343-eeafd1cfc5a2","Type":"ContainerDied","Data":"7383c33a941bc11de172da9a0b4cbc003b2559627715da6b52b4d1331cad7020"} Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.019593 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.146774 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-catalog-content\") pod \"136af4de-2abf-4294-8343-eeafd1cfc5a2\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.146846 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-746nd\" (UniqueName: \"kubernetes.io/projected/136af4de-2abf-4294-8343-eeafd1cfc5a2-kube-api-access-746nd\") pod \"136af4de-2abf-4294-8343-eeafd1cfc5a2\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.147023 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-utilities\") pod \"136af4de-2abf-4294-8343-eeafd1cfc5a2\" (UID: \"136af4de-2abf-4294-8343-eeafd1cfc5a2\") " Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.147932 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-utilities" (OuterVolumeSpecName: "utilities") pod "136af4de-2abf-4294-8343-eeafd1cfc5a2" (UID: "136af4de-2abf-4294-8343-eeafd1cfc5a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.155017 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136af4de-2abf-4294-8343-eeafd1cfc5a2-kube-api-access-746nd" (OuterVolumeSpecName: "kube-api-access-746nd") pod "136af4de-2abf-4294-8343-eeafd1cfc5a2" (UID: "136af4de-2abf-4294-8343-eeafd1cfc5a2"). InnerVolumeSpecName "kube-api-access-746nd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.205821 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "136af4de-2abf-4294-8343-eeafd1cfc5a2" (UID: "136af4de-2abf-4294-8343-eeafd1cfc5a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.249782 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.249828 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-746nd\" (UniqueName: \"kubernetes.io/projected/136af4de-2abf-4294-8343-eeafd1cfc5a2-kube-api-access-746nd\") on node \"crc\" DevicePath \"\"" Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.249845 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/136af4de-2abf-4294-8343-eeafd1cfc5a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.822831 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrjx9" event={"ID":"136af4de-2abf-4294-8343-eeafd1cfc5a2","Type":"ContainerDied","Data":"328737832f3e6d1a48d3c1eb6eaa4b15cc391256c0291414b2d155a2680f3d35"} Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.823192 4717 scope.go:117] "RemoveContainer" containerID="7383c33a941bc11de172da9a0b4cbc003b2559627715da6b52b4d1331cad7020" Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.822943 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wrjx9" Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.860764 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wrjx9"] Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.863393 4717 scope.go:117] "RemoveContainer" containerID="6105b73bf5332deb8ec7d982d5decadcf8af54ed5f7fc8c6eca4bde2a4c310bc" Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.882610 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wrjx9"] Mar 08 06:58:43 crc kubenswrapper[4717]: I0308 06:58:43.890448 4717 scope.go:117] "RemoveContainer" containerID="b5dc8b0975d474fb5f54188e43e3685301bd4ed7a76c3895d6adc7b846ec9ed1" Mar 08 06:58:45 crc kubenswrapper[4717]: I0308 06:58:45.796259 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="136af4de-2abf-4294-8343-eeafd1cfc5a2" path="/var/lib/kubelet/pods/136af4de-2abf-4294-8343-eeafd1cfc5a2/volumes" Mar 08 06:58:52 crc kubenswrapper[4717]: I0308 06:58:52.560844 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/util/0.log" Mar 08 06:58:52 crc kubenswrapper[4717]: I0308 06:58:52.777507 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/pull/0.log" Mar 08 06:58:52 crc kubenswrapper[4717]: I0308 06:58:52.798297 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/util/0.log" Mar 08 06:58:52 crc kubenswrapper[4717]: I0308 06:58:52.807999 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/pull/0.log" Mar 08 06:58:53 crc kubenswrapper[4717]: I0308 06:58:53.012991 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/extract/0.log" Mar 08 06:58:53 crc kubenswrapper[4717]: I0308 06:58:53.024313 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/util/0.log" Mar 08 06:58:53 crc kubenswrapper[4717]: I0308 06:58:53.070807 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/pull/0.log" Mar 08 06:58:53 crc kubenswrapper[4717]: I0308 06:58:53.998920 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-wdwbm_bf98a4b8-6e3c-423d-b228-347c527e6721/manager/0.log" Mar 08 06:58:54 crc kubenswrapper[4717]: I0308 06:58:54.385640 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-vbdls_11010a39-3786-472f-ad04-805c35647afc/manager/0.log" Mar 08 06:58:54 crc kubenswrapper[4717]: I0308 06:58:54.706539 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-vpqhf_74fc8d21-150d-4009-b0ba-b6a47db5adbb/manager/0.log" Mar 08 06:58:55 crc kubenswrapper[4717]: I0308 06:58:55.030382 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-s8gsj_9fe70b75-885a-402b-98e1-f5c696e47f48/manager/0.log" Mar 08 06:58:55 crc kubenswrapper[4717]: I0308 
06:58:55.623917 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-x44f7_9c64b2a6-6663-46cc-b762-bffa01baeb47/manager/0.log" Mar 08 06:58:55 crc kubenswrapper[4717]: I0308 06:58:55.944588 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-vrnlx_be14026d-4e86-4134-8f2a-617e9272d2a1/manager/0.log" Mar 08 06:58:55 crc kubenswrapper[4717]: I0308 06:58:55.990205 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-x5kbt_8f3bb097-82e6-4fe8-ad89-48004c80477b/manager/0.log" Mar 08 06:58:56 crc kubenswrapper[4717]: I0308 06:58:56.015839 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-wmjsb_7668ece6-7b88-4707-baf2-62379071cf43/manager/0.log" Mar 08 06:58:56 crc kubenswrapper[4717]: I0308 06:58:56.144102 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-mp5dj_7eda0f52-4fcf-46fe-b329-075fb4d79c74/manager/0.log" Mar 08 06:58:56 crc kubenswrapper[4717]: I0308 06:58:56.293418 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-959nd_148c1a2c-7098-4111-a12e-02e2dcc295a6/manager/0.log" Mar 08 06:58:56 crc kubenswrapper[4717]: I0308 06:58:56.478780 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-hl7k4_999e5f1a-4be7-4716-8999-e28027c618b9/manager/0.log" Mar 08 06:58:56 crc kubenswrapper[4717]: I0308 06:58:56.648012 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-htv95_17485954-f1e6-4042-9338-ad5115801764/manager/0.log" Mar 08 06:58:56 crc kubenswrapper[4717]: 
I0308 06:58:56.808649 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-cmn95_44e5de82-d168-400e-801f-1f122a08c656/manager/0.log" Mar 08 06:58:56 crc kubenswrapper[4717]: I0308 06:58:56.847339 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n_f09b3f70-1158-4269-abf3-acf3fecc0cb9/manager/0.log" Mar 08 06:58:57 crc kubenswrapper[4717]: I0308 06:58:57.160616 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6f44f7b99f-4xgmb_003c1f39-7ea2-4391-87f9-875cbdf6e1cc/operator/0.log" Mar 08 06:58:57 crc kubenswrapper[4717]: I0308 06:58:57.441150 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-dfmch_3cd0ad0a-7a9e-4870-8b76-58f975cd36e4/manager/0.log" Mar 08 06:58:57 crc kubenswrapper[4717]: I0308 06:58:57.443719 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-bdxpf_33dbebe0-8cce-49d5-afc5-287c2c188438/registry-server/0.log" Mar 08 06:58:57 crc kubenswrapper[4717]: I0308 06:58:57.688599 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-bv6fm_ae7df2ae-3ad9-4c73-a957-fe35b87703ec/manager/0.log" Mar 08 06:58:58 crc kubenswrapper[4717]: I0308 06:58:58.157048 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4bqgl_7da8d6da-69ae-4351-a774-20888648eac2/operator/0.log" Mar 08 06:58:58 crc kubenswrapper[4717]: I0308 06:58:58.186245 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-djsmm_d7c1a0d3-1242-402f-88a3-6d45d4c6661a/manager/0.log" Mar 08 06:58:58 crc 
kubenswrapper[4717]: I0308 06:58:58.510848 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-cvjxp_4a173035-b1d9-4435-a2d1-b29e9bea39be/manager/0.log" Mar 08 06:58:58 crc kubenswrapper[4717]: I0308 06:58:58.713746 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-6hf5q_3689217b-f2db-4d81-8e68-7f728ce20860/manager/0.log" Mar 08 06:58:58 crc kubenswrapper[4717]: I0308 06:58:58.979604 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-hbqjn_56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5/manager/0.log" Mar 08 06:58:59 crc kubenswrapper[4717]: I0308 06:58:59.001358 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dfcb4d64f-8wwfb_1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b/manager/0.log" Mar 08 06:59:04 crc kubenswrapper[4717]: I0308 06:59:04.119495 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 06:59:04 crc kubenswrapper[4717]: I0308 06:59:04.119963 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 06:59:04 crc kubenswrapper[4717]: I0308 06:59:04.120021 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 06:59:04 crc kubenswrapper[4717]: I0308 
06:59:04.121889 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7bb4ca3169b930f0d41e9232d9c0d19998dde248047150051071c9eed9222ab2"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 06:59:04 crc kubenswrapper[4717]: I0308 06:59:04.121973 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://7bb4ca3169b930f0d41e9232d9c0d19998dde248047150051071c9eed9222ab2" gracePeriod=600 Mar 08 06:59:04 crc kubenswrapper[4717]: I0308 06:59:04.197769 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-7x79h_8e10706f-2cf2-4b11-a084-33df5b7fe0a1/manager/0.log" Mar 08 06:59:05 crc kubenswrapper[4717]: I0308 06:59:05.074298 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="7bb4ca3169b930f0d41e9232d9c0d19998dde248047150051071c9eed9222ab2" exitCode=0 Mar 08 06:59:05 crc kubenswrapper[4717]: I0308 06:59:05.074370 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"7bb4ca3169b930f0d41e9232d9c0d19998dde248047150051071c9eed9222ab2"} Mar 08 06:59:05 crc kubenswrapper[4717]: I0308 06:59:05.074983 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f"} Mar 08 06:59:05 crc 
kubenswrapper[4717]: I0308 06:59:05.075010 4717 scope.go:117] "RemoveContainer" containerID="5ab2c9423c6c215999eb85a1b41bd7fe1e7056a27f3a75d1081ae2d261303fc2" Mar 08 06:59:21 crc kubenswrapper[4717]: I0308 06:59:21.455899 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hhrcp_0507da4e-a2d5-43c2-b5e2-25f42085431c/control-plane-machine-set-operator/0.log" Mar 08 06:59:21 crc kubenswrapper[4717]: I0308 06:59:21.653477 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ppg2t_4ca60946-75d5-469e-84f0-d200ca8c0cfd/machine-api-operator/0.log" Mar 08 06:59:21 crc kubenswrapper[4717]: I0308 06:59:21.699877 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ppg2t_4ca60946-75d5-469e-84f0-d200ca8c0cfd/kube-rbac-proxy/0.log" Mar 08 06:59:36 crc kubenswrapper[4717]: I0308 06:59:36.480103 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-q9jgr_af96d97e-e051-406c-b0f5-c9d59fb60bfa/cert-manager-controller/0.log" Mar 08 06:59:36 crc kubenswrapper[4717]: I0308 06:59:36.716438 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-w5ms2_1e9abf00-821a-412c-b6da-fa5c1f1a568a/cert-manager-cainjector/0.log" Mar 08 06:59:36 crc kubenswrapper[4717]: I0308 06:59:36.737521 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-z8plj_8815e0c5-e3aa-4015-95c2-e2091a21ef2f/cert-manager-webhook/0.log" Mar 08 06:59:51 crc kubenswrapper[4717]: I0308 06:59:51.471097 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-gcpqm_a99ee055-c0a9-4a9b-8787-45f90f0e41f0/nmstate-console-plugin/0.log" Mar 08 06:59:51 crc kubenswrapper[4717]: I0308 06:59:51.674077 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-57gbg_18fa48fe-7964-43d4-8e35-f0e459dd40ea/nmstate-handler/0.log" Mar 08 06:59:51 crc kubenswrapper[4717]: I0308 06:59:51.766244 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-nvc4p_474b5a28-e5de-4fdc-814e-588f604686f4/kube-rbac-proxy/0.log" Mar 08 06:59:51 crc kubenswrapper[4717]: I0308 06:59:51.856550 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-nvc4p_474b5a28-e5de-4fdc-814e-588f604686f4/nmstate-metrics/0.log" Mar 08 06:59:51 crc kubenswrapper[4717]: I0308 06:59:51.948906 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-qt88p_396ff7f1-399f-4510-96a1-d17996841dba/nmstate-operator/0.log" Mar 08 06:59:52 crc kubenswrapper[4717]: I0308 06:59:52.590574 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-w6x5m_3e38f069-dcb2-471a-9124-87af836a0e11/nmstate-webhook/0.log" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.159838 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549220-ztvdz"] Mar 08 07:00:00 crc kubenswrapper[4717]: E0308 07:00:00.160761 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerName="extract-utilities" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.160774 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerName="extract-utilities" Mar 08 07:00:00 crc kubenswrapper[4717]: E0308 07:00:00.160814 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerName="extract-content" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.160819 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerName="extract-content" Mar 08 07:00:00 crc kubenswrapper[4717]: E0308 07:00:00.160843 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerName="registry-server" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.160850 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerName="registry-server" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.161025 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="136af4de-2abf-4294-8343-eeafd1cfc5a2" containerName="registry-server" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.161676 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549220-ztvdz" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.164644 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.165180 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.170481 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd"] Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.172423 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.174305 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.174760 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.175308 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.180784 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549220-ztvdz"] Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.191025 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd"] Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.330844 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd269154-6322-47d4-9cb4-09d940c8516e-config-volume\") pod \"collect-profiles-29549220-q24bd\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.330893 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbtrv\" (UniqueName: \"kubernetes.io/projected/fd269154-6322-47d4-9cb4-09d940c8516e-kube-api-access-lbtrv\") pod \"collect-profiles-29549220-q24bd\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 
07:00:00.331112 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd269154-6322-47d4-9cb4-09d940c8516e-secret-volume\") pod \"collect-profiles-29549220-q24bd\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.331665 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmzx7\" (UniqueName: \"kubernetes.io/projected/1f67f15a-7766-4e30-b56b-d4bdc49e5cdd-kube-api-access-wmzx7\") pod \"auto-csr-approver-29549220-ztvdz\" (UID: \"1f67f15a-7766-4e30-b56b-d4bdc49e5cdd\") " pod="openshift-infra/auto-csr-approver-29549220-ztvdz" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.433885 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmzx7\" (UniqueName: \"kubernetes.io/projected/1f67f15a-7766-4e30-b56b-d4bdc49e5cdd-kube-api-access-wmzx7\") pod \"auto-csr-approver-29549220-ztvdz\" (UID: \"1f67f15a-7766-4e30-b56b-d4bdc49e5cdd\") " pod="openshift-infra/auto-csr-approver-29549220-ztvdz" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.434292 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd269154-6322-47d4-9cb4-09d940c8516e-config-volume\") pod \"collect-profiles-29549220-q24bd\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.434322 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbtrv\" (UniqueName: \"kubernetes.io/projected/fd269154-6322-47d4-9cb4-09d940c8516e-kube-api-access-lbtrv\") pod \"collect-profiles-29549220-q24bd\" (UID: 
\"fd269154-6322-47d4-9cb4-09d940c8516e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.435118 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd269154-6322-47d4-9cb4-09d940c8516e-config-volume\") pod \"collect-profiles-29549220-q24bd\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.435241 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd269154-6322-47d4-9cb4-09d940c8516e-secret-volume\") pod \"collect-profiles-29549220-q24bd\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.449720 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd269154-6322-47d4-9cb4-09d940c8516e-secret-volume\") pod \"collect-profiles-29549220-q24bd\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.461353 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbtrv\" (UniqueName: \"kubernetes.io/projected/fd269154-6322-47d4-9cb4-09d940c8516e-kube-api-access-lbtrv\") pod \"collect-profiles-29549220-q24bd\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.462236 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmzx7\" (UniqueName: 
\"kubernetes.io/projected/1f67f15a-7766-4e30-b56b-d4bdc49e5cdd-kube-api-access-wmzx7\") pod \"auto-csr-approver-29549220-ztvdz\" (UID: \"1f67f15a-7766-4e30-b56b-d4bdc49e5cdd\") " pod="openshift-infra/auto-csr-approver-29549220-ztvdz" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.484917 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549220-ztvdz" Mar 08 07:00:00 crc kubenswrapper[4717]: I0308 07:00:00.499572 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:01 crc kubenswrapper[4717]: I0308 07:00:01.067442 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549220-ztvdz"] Mar 08 07:00:01 crc kubenswrapper[4717]: I0308 07:00:01.162238 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd"] Mar 08 07:00:01 crc kubenswrapper[4717]: W0308 07:00:01.162460 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd269154_6322_47d4_9cb4_09d940c8516e.slice/crio-884a915576f0fe19aae70cb9592a95f453c85ee716d2d4f9daa6387a961126f9 WatchSource:0}: Error finding container 884a915576f0fe19aae70cb9592a95f453c85ee716d2d4f9daa6387a961126f9: Status 404 returned error can't find the container with id 884a915576f0fe19aae70cb9592a95f453c85ee716d2d4f9daa6387a961126f9 Mar 08 07:00:01 crc kubenswrapper[4717]: I0308 07:00:01.660552 4717 generic.go:334] "Generic (PLEG): container finished" podID="fd269154-6322-47d4-9cb4-09d940c8516e" containerID="0feafa0012687137a41e7843115c94dc2da0a10be6f56f7571ad202d880a4153" exitCode=0 Mar 08 07:00:01 crc kubenswrapper[4717]: I0308 07:00:01.660647 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" 
event={"ID":"fd269154-6322-47d4-9cb4-09d940c8516e","Type":"ContainerDied","Data":"0feafa0012687137a41e7843115c94dc2da0a10be6f56f7571ad202d880a4153"} Mar 08 07:00:01 crc kubenswrapper[4717]: I0308 07:00:01.661047 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" event={"ID":"fd269154-6322-47d4-9cb4-09d940c8516e","Type":"ContainerStarted","Data":"884a915576f0fe19aae70cb9592a95f453c85ee716d2d4f9daa6387a961126f9"} Mar 08 07:00:01 crc kubenswrapper[4717]: I0308 07:00:01.662078 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549220-ztvdz" event={"ID":"1f67f15a-7766-4e30-b56b-d4bdc49e5cdd","Type":"ContainerStarted","Data":"4df2ff0de8acb4b229316159500b924ed9df54da3d67de1aae72f0d6192356f7"} Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.132708 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.296310 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbtrv\" (UniqueName: \"kubernetes.io/projected/fd269154-6322-47d4-9cb4-09d940c8516e-kube-api-access-lbtrv\") pod \"fd269154-6322-47d4-9cb4-09d940c8516e\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.296381 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd269154-6322-47d4-9cb4-09d940c8516e-config-volume\") pod \"fd269154-6322-47d4-9cb4-09d940c8516e\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.296578 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/fd269154-6322-47d4-9cb4-09d940c8516e-secret-volume\") pod \"fd269154-6322-47d4-9cb4-09d940c8516e\" (UID: \"fd269154-6322-47d4-9cb4-09d940c8516e\") " Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.297120 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd269154-6322-47d4-9cb4-09d940c8516e-config-volume" (OuterVolumeSpecName: "config-volume") pod "fd269154-6322-47d4-9cb4-09d940c8516e" (UID: "fd269154-6322-47d4-9cb4-09d940c8516e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.297735 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd269154-6322-47d4-9cb4-09d940c8516e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.302863 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd269154-6322-47d4-9cb4-09d940c8516e-kube-api-access-lbtrv" (OuterVolumeSpecName: "kube-api-access-lbtrv") pod "fd269154-6322-47d4-9cb4-09d940c8516e" (UID: "fd269154-6322-47d4-9cb4-09d940c8516e"). InnerVolumeSpecName "kube-api-access-lbtrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.320074 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd269154-6322-47d4-9cb4-09d940c8516e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fd269154-6322-47d4-9cb4-09d940c8516e" (UID: "fd269154-6322-47d4-9cb4-09d940c8516e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.399963 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd269154-6322-47d4-9cb4-09d940c8516e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.400010 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbtrv\" (UniqueName: \"kubernetes.io/projected/fd269154-6322-47d4-9cb4-09d940c8516e-kube-api-access-lbtrv\") on node \"crc\" DevicePath \"\"" Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.679110 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" event={"ID":"fd269154-6322-47d4-9cb4-09d940c8516e","Type":"ContainerDied","Data":"884a915576f0fe19aae70cb9592a95f453c85ee716d2d4f9daa6387a961126f9"} Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.679421 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="884a915576f0fe19aae70cb9592a95f453c85ee716d2d4f9daa6387a961126f9" Mar 08 07:00:03 crc kubenswrapper[4717]: I0308 07:00:03.679180 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549220-q24bd" Mar 08 07:00:04 crc kubenswrapper[4717]: I0308 07:00:04.218186 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm"] Mar 08 07:00:04 crc kubenswrapper[4717]: I0308 07:00:04.233475 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549175-df2jm"] Mar 08 07:00:05 crc kubenswrapper[4717]: I0308 07:00:05.698606 4717 generic.go:334] "Generic (PLEG): container finished" podID="1f67f15a-7766-4e30-b56b-d4bdc49e5cdd" containerID="2270a8149ef74bd357dff94b9661a23fa028748183aeeb1eb926bee96a538cca" exitCode=0 Mar 08 07:00:05 crc kubenswrapper[4717]: I0308 07:00:05.699132 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549220-ztvdz" event={"ID":"1f67f15a-7766-4e30-b56b-d4bdc49e5cdd","Type":"ContainerDied","Data":"2270a8149ef74bd357dff94b9661a23fa028748183aeeb1eb926bee96a538cca"} Mar 08 07:00:05 crc kubenswrapper[4717]: I0308 07:00:05.795589 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a9fed6-7639-4038-bc62-0b0ec30c1772" path="/var/lib/kubelet/pods/72a9fed6-7639-4038-bc62-0b0ec30c1772/volumes" Mar 08 07:00:07 crc kubenswrapper[4717]: I0308 07:00:07.108255 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549220-ztvdz" Mar 08 07:00:07 crc kubenswrapper[4717]: I0308 07:00:07.194946 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmzx7\" (UniqueName: \"kubernetes.io/projected/1f67f15a-7766-4e30-b56b-d4bdc49e5cdd-kube-api-access-wmzx7\") pod \"1f67f15a-7766-4e30-b56b-d4bdc49e5cdd\" (UID: \"1f67f15a-7766-4e30-b56b-d4bdc49e5cdd\") " Mar 08 07:00:07 crc kubenswrapper[4717]: I0308 07:00:07.204630 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f67f15a-7766-4e30-b56b-d4bdc49e5cdd-kube-api-access-wmzx7" (OuterVolumeSpecName: "kube-api-access-wmzx7") pod "1f67f15a-7766-4e30-b56b-d4bdc49e5cdd" (UID: "1f67f15a-7766-4e30-b56b-d4bdc49e5cdd"). InnerVolumeSpecName "kube-api-access-wmzx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:00:07 crc kubenswrapper[4717]: I0308 07:00:07.297279 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmzx7\" (UniqueName: \"kubernetes.io/projected/1f67f15a-7766-4e30-b56b-d4bdc49e5cdd-kube-api-access-wmzx7\") on node \"crc\" DevicePath \"\"" Mar 08 07:00:07 crc kubenswrapper[4717]: I0308 07:00:07.718981 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549220-ztvdz" event={"ID":"1f67f15a-7766-4e30-b56b-d4bdc49e5cdd","Type":"ContainerDied","Data":"4df2ff0de8acb4b229316159500b924ed9df54da3d67de1aae72f0d6192356f7"} Mar 08 07:00:07 crc kubenswrapper[4717]: I0308 07:00:07.719017 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4df2ff0de8acb4b229316159500b924ed9df54da3d67de1aae72f0d6192356f7" Mar 08 07:00:07 crc kubenswrapper[4717]: I0308 07:00:07.719372 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549220-ztvdz" Mar 08 07:00:08 crc kubenswrapper[4717]: I0308 07:00:08.170920 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549214-sz8rk"] Mar 08 07:00:08 crc kubenswrapper[4717]: I0308 07:00:08.180474 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549214-sz8rk"] Mar 08 07:00:09 crc kubenswrapper[4717]: I0308 07:00:09.026887 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9mtbk_186bbba6-72b1-4834-9f78-65c0099a8be8/prometheus-operator/0.log" Mar 08 07:00:09 crc kubenswrapper[4717]: I0308 07:00:09.195734 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp_dad3a63c-7244-41bd-85d4-38046d2ecf3f/prometheus-operator-admission-webhook/0.log" Mar 08 07:00:09 crc kubenswrapper[4717]: I0308 07:00:09.285498 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd_0e21fc32-f762-4f29-9ed5-7ab0e28be6a7/prometheus-operator-admission-webhook/0.log" Mar 08 07:00:09 crc kubenswrapper[4717]: I0308 07:00:09.437370 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-pjlrw_5eaaec5c-b81f-4400-8237-3cb96bac6a73/operator/0.log" Mar 08 07:00:09 crc kubenswrapper[4717]: I0308 07:00:09.505028 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-hmpm8_6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5/perses-operator/0.log" Mar 08 07:00:09 crc kubenswrapper[4717]: I0308 07:00:09.794761 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3846f96d-edbf-4b3e-b9db-8abd444b27b9" path="/var/lib/kubelet/pods/3846f96d-edbf-4b3e-b9db-8abd444b27b9/volumes" Mar 08 
07:00:24 crc kubenswrapper[4717]: I0308 07:00:24.692196 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-96ksw_65f01f58-dbf5-4547-9249-ab613d4f85db/kube-rbac-proxy/0.log" Mar 08 07:00:24 crc kubenswrapper[4717]: I0308 07:00:24.768948 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-96ksw_65f01f58-dbf5-4547-9249-ab613d4f85db/controller/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.109737 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-vwtjg_88b4e6f2-24f0-4e67-ab40-e3621ab5b44f/frr-k8s-webhook-server/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.169726 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-frr-files/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.375863 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-frr-files/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.396738 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-metrics/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.399733 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-reloader/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.401071 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-reloader/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.566391 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-frr-files/0.log" Mar 08 07:00:25 crc 
kubenswrapper[4717]: I0308 07:00:25.579817 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-reloader/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.603634 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-metrics/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.604024 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-metrics/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.752108 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-frr-files/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.770404 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/controller/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.771820 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-reloader/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.822748 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-metrics/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.949299 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/frr-metrics/0.log" Mar 08 07:00:25 crc kubenswrapper[4717]: I0308 07:00:25.963125 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/kube-rbac-proxy/0.log" Mar 08 07:00:26 crc kubenswrapper[4717]: I0308 07:00:26.051353 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/kube-rbac-proxy-frr/0.log" Mar 08 07:00:26 crc kubenswrapper[4717]: I0308 07:00:26.182126 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/reloader/0.log" Mar 08 07:00:26 crc kubenswrapper[4717]: I0308 07:00:26.282599 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d59c89549-fbpjz_de9b0a52-bf0f-4566-bcb4-f52c31916a41/manager/0.log" Mar 08 07:00:26 crc kubenswrapper[4717]: I0308 07:00:26.386053 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7f74747698-24c78_3114eda7-af43-45d9-955c-116f643af398/webhook-server/0.log" Mar 08 07:00:27 crc kubenswrapper[4717]: I0308 07:00:27.121247 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-27q99_b4a0e98d-c9c3-4d97-ab3a-cd63903fd104/kube-rbac-proxy/0.log" Mar 08 07:00:27 crc kubenswrapper[4717]: I0308 07:00:27.604737 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-27q99_b4a0e98d-c9c3-4d97-ab3a-cd63903fd104/speaker/0.log" Mar 08 07:00:28 crc kubenswrapper[4717]: I0308 07:00:28.013935 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/frr/0.log" Mar 08 07:00:40 crc kubenswrapper[4717]: I0308 07:00:40.746594 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/util/0.log" Mar 08 07:00:40 crc kubenswrapper[4717]: I0308 07:00:40.933070 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/pull/0.log" Mar 08 07:00:40 crc 
kubenswrapper[4717]: I0308 07:00:40.966461 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/pull/0.log" Mar 08 07:00:40 crc kubenswrapper[4717]: I0308 07:00:40.991715 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/util/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.170246 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/pull/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.199095 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/util/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.226500 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/extract/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.340512 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/util/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.526586 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/util/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.528539 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/pull/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.529270 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/pull/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.694053 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/util/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.727348 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/pull/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.746735 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/extract/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.886521 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-utilities/0.log" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.948590 4717 scope.go:117] "RemoveContainer" containerID="424b8ac215b5a68f15fcef7c64a1052c4ce7fd010b1d09ddc06daf94bc750620" Mar 08 07:00:41 crc kubenswrapper[4717]: I0308 07:00:41.982305 4717 scope.go:117] "RemoveContainer" containerID="b5ad4c57144b959b90f11f4c6b639ac217c5d8491716dd26da0ebc22be04b5df" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.083563 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-content/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.115189 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-content/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.183360 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-utilities/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.247579 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-utilities/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.274924 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-content/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.443174 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-utilities/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.746289 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-content/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.761746 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-content/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.865340 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-utilities/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.886631 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/registry-server/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.969257 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-utilities/0.log" Mar 08 07:00:42 crc kubenswrapper[4717]: I0308 07:00:42.969413 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-content/0.log" Mar 08 07:00:43 crc kubenswrapper[4717]: I0308 07:00:43.207914 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/util/0.log" Mar 08 07:00:43 crc kubenswrapper[4717]: I0308 07:00:43.371455 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/util/0.log" Mar 08 07:00:43 crc kubenswrapper[4717]: I0308 07:00:43.379347 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/pull/0.log" Mar 08 07:00:43 crc kubenswrapper[4717]: I0308 07:00:43.465246 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/pull/0.log" Mar 08 07:00:43 crc kubenswrapper[4717]: I0308 07:00:43.483728 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/registry-server/0.log" Mar 08 07:00:43 crc kubenswrapper[4717]: I0308 07:00:43.610715 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/pull/0.log" Mar 08 07:00:43 crc kubenswrapper[4717]: I0308 07:00:43.622360 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/util/0.log" Mar 08 07:00:43 crc kubenswrapper[4717]: I0308 07:00:43.651186 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/extract/0.log" Mar 08 07:00:43 crc kubenswrapper[4717]: I0308 07:00:43.809825 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cmqxx_61147cf3-b98d-4c9f-a053-2d818468c5e0/marketplace-operator/0.log" Mar 08 07:00:43 crc kubenswrapper[4717]: I0308 07:00:43.844177 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-utilities/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.086224 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-utilities/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.093328 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-content/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.104345 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-content/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.301885 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-utilities/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.304612 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-content/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.515840 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-utilities/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.516345 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/registry-server/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.658879 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-content/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.698117 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-content/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.698236 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-utilities/0.log" Mar 08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.857453 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-content/0.log" Mar 
08 07:00:44 crc kubenswrapper[4717]: I0308 07:00:44.861991 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-utilities/0.log" Mar 08 07:00:45 crc kubenswrapper[4717]: I0308 07:00:45.525425 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/registry-server/0.log" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.158515 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29549221-mwzz7"] Mar 08 07:01:00 crc kubenswrapper[4717]: E0308 07:01:00.159482 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f67f15a-7766-4e30-b56b-d4bdc49e5cdd" containerName="oc" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.159497 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f67f15a-7766-4e30-b56b-d4bdc49e5cdd" containerName="oc" Mar 08 07:01:00 crc kubenswrapper[4717]: E0308 07:01:00.159527 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd269154-6322-47d4-9cb4-09d940c8516e" containerName="collect-profiles" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.159535 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd269154-6322-47d4-9cb4-09d940c8516e" containerName="collect-profiles" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.159759 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd269154-6322-47d4-9cb4-09d940c8516e" containerName="collect-profiles" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.159792 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f67f15a-7766-4e30-b56b-d4bdc49e5cdd" containerName="oc" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.160484 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.177587 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29549221-mwzz7"] Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.232537 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxwx7\" (UniqueName: \"kubernetes.io/projected/3c258b2f-d522-4be9-a894-01fdd5289ebe-kube-api-access-qxwx7\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.232864 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-combined-ca-bundle\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.232902 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-fernet-keys\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.233023 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-config-data\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.334273 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-config-data\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.334393 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxwx7\" (UniqueName: \"kubernetes.io/projected/3c258b2f-d522-4be9-a894-01fdd5289ebe-kube-api-access-qxwx7\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.334419 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-combined-ca-bundle\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.334447 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-fernet-keys\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.341285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-combined-ca-bundle\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.346744 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-config-data\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.352035 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-fernet-keys\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.354885 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxwx7\" (UniqueName: \"kubernetes.io/projected/3c258b2f-d522-4be9-a894-01fdd5289ebe-kube-api-access-qxwx7\") pod \"keystone-cron-29549221-mwzz7\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.485217 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:00 crc kubenswrapper[4717]: I0308 07:01:00.948293 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29549221-mwzz7"] Mar 08 07:01:01 crc kubenswrapper[4717]: I0308 07:01:01.164227 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9mtbk_186bbba6-72b1-4834-9f78-65c0099a8be8/prometheus-operator/0.log" Mar 08 07:01:01 crc kubenswrapper[4717]: I0308 07:01:01.203536 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp_dad3a63c-7244-41bd-85d4-38046d2ecf3f/prometheus-operator-admission-webhook/0.log" Mar 08 07:01:01 crc kubenswrapper[4717]: I0308 07:01:01.295588 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549221-mwzz7" event={"ID":"3c258b2f-d522-4be9-a894-01fdd5289ebe","Type":"ContainerStarted","Data":"4033f148a8300c3f02c1db2095bb641304a85d1877503259718401842ed524fd"} Mar 08 07:01:01 crc kubenswrapper[4717]: I0308 07:01:01.295631 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549221-mwzz7" event={"ID":"3c258b2f-d522-4be9-a894-01fdd5289ebe","Type":"ContainerStarted","Data":"c04bfac470f0b81d7f2ff31a2932ea0ee83d797aa98d3491e5e98cd35a5f8d72"} Mar 08 07:01:01 crc kubenswrapper[4717]: I0308 07:01:01.297254 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd_0e21fc32-f762-4f29-9ed5-7ab0e28be6a7/prometheus-operator-admission-webhook/0.log" Mar 08 07:01:01 crc kubenswrapper[4717]: I0308 07:01:01.321658 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29549221-mwzz7" podStartSLOduration=1.3216425649999999 podStartE2EDuration="1.321642565s" podCreationTimestamp="2026-03-08 07:01:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 07:01:01.319920363 +0000 UTC m=+5688.237569207" watchObservedRunningTime="2026-03-08 07:01:01.321642565 +0000 UTC m=+5688.239291409" Mar 08 07:01:01 crc kubenswrapper[4717]: I0308 07:01:01.459174 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-hmpm8_6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5/perses-operator/0.log" Mar 08 07:01:01 crc kubenswrapper[4717]: I0308 07:01:01.491651 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-pjlrw_5eaaec5c-b81f-4400-8237-3cb96bac6a73/operator/0.log" Mar 08 07:01:04 crc kubenswrapper[4717]: I0308 07:01:04.120346 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 07:01:04 crc kubenswrapper[4717]: I0308 07:01:04.121712 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 07:01:05 crc kubenswrapper[4717]: I0308 07:01:05.330013 4717 generic.go:334] "Generic (PLEG): container finished" podID="3c258b2f-d522-4be9-a894-01fdd5289ebe" containerID="4033f148a8300c3f02c1db2095bb641304a85d1877503259718401842ed524fd" exitCode=0 Mar 08 07:01:05 crc kubenswrapper[4717]: I0308 07:01:05.330067 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549221-mwzz7" 
event={"ID":"3c258b2f-d522-4be9-a894-01fdd5289ebe","Type":"ContainerDied","Data":"4033f148a8300c3f02c1db2095bb641304a85d1877503259718401842ed524fd"} Mar 08 07:01:06 crc kubenswrapper[4717]: I0308 07:01:06.807044 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:06 crc kubenswrapper[4717]: I0308 07:01:06.967716 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-combined-ca-bundle\") pod \"3c258b2f-d522-4be9-a894-01fdd5289ebe\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " Mar 08 07:01:06 crc kubenswrapper[4717]: I0308 07:01:06.967802 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-fernet-keys\") pod \"3c258b2f-d522-4be9-a894-01fdd5289ebe\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " Mar 08 07:01:06 crc kubenswrapper[4717]: I0308 07:01:06.967848 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-config-data\") pod \"3c258b2f-d522-4be9-a894-01fdd5289ebe\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " Mar 08 07:01:06 crc kubenswrapper[4717]: I0308 07:01:06.967936 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxwx7\" (UniqueName: \"kubernetes.io/projected/3c258b2f-d522-4be9-a894-01fdd5289ebe-kube-api-access-qxwx7\") pod \"3c258b2f-d522-4be9-a894-01fdd5289ebe\" (UID: \"3c258b2f-d522-4be9-a894-01fdd5289ebe\") " Mar 08 07:01:06 crc kubenswrapper[4717]: I0308 07:01:06.974837 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "3c258b2f-d522-4be9-a894-01fdd5289ebe" (UID: "3c258b2f-d522-4be9-a894-01fdd5289ebe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 07:01:06 crc kubenswrapper[4717]: I0308 07:01:06.975362 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c258b2f-d522-4be9-a894-01fdd5289ebe-kube-api-access-qxwx7" (OuterVolumeSpecName: "kube-api-access-qxwx7") pod "3c258b2f-d522-4be9-a894-01fdd5289ebe" (UID: "3c258b2f-d522-4be9-a894-01fdd5289ebe"). InnerVolumeSpecName "kube-api-access-qxwx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:01:07 crc kubenswrapper[4717]: I0308 07:01:07.038776 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c258b2f-d522-4be9-a894-01fdd5289ebe" (UID: "3c258b2f-d522-4be9-a894-01fdd5289ebe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 07:01:07 crc kubenswrapper[4717]: I0308 07:01:07.069967 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxwx7\" (UniqueName: \"kubernetes.io/projected/3c258b2f-d522-4be9-a894-01fdd5289ebe-kube-api-access-qxwx7\") on node \"crc\" DevicePath \"\"" Mar 08 07:01:07 crc kubenswrapper[4717]: I0308 07:01:07.069993 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 07:01:07 crc kubenswrapper[4717]: I0308 07:01:07.070002 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 07:01:07 crc kubenswrapper[4717]: I0308 07:01:07.093853 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-config-data" (OuterVolumeSpecName: "config-data") pod "3c258b2f-d522-4be9-a894-01fdd5289ebe" (UID: "3c258b2f-d522-4be9-a894-01fdd5289ebe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 07:01:07 crc kubenswrapper[4717]: I0308 07:01:07.171446 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c258b2f-d522-4be9-a894-01fdd5289ebe-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 07:01:07 crc kubenswrapper[4717]: I0308 07:01:07.351903 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549221-mwzz7" event={"ID":"3c258b2f-d522-4be9-a894-01fdd5289ebe","Type":"ContainerDied","Data":"c04bfac470f0b81d7f2ff31a2932ea0ee83d797aa98d3491e5e98cd35a5f8d72"} Mar 08 07:01:07 crc kubenswrapper[4717]: I0308 07:01:07.351935 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c04bfac470f0b81d7f2ff31a2932ea0ee83d797aa98d3491e5e98cd35a5f8d72" Mar 08 07:01:07 crc kubenswrapper[4717]: I0308 07:01:07.351963 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29549221-mwzz7" Mar 08 07:01:34 crc kubenswrapper[4717]: I0308 07:01:34.120142 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 07:01:34 crc kubenswrapper[4717]: I0308 07:01:34.120779 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.153799 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549222-h24gp"] Mar 08 07:02:00 crc kubenswrapper[4717]: E0308 
07:02:00.155103 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c258b2f-d522-4be9-a894-01fdd5289ebe" containerName="keystone-cron" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.155126 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c258b2f-d522-4be9-a894-01fdd5289ebe" containerName="keystone-cron" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.155425 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c258b2f-d522-4be9-a894-01fdd5289ebe" containerName="keystone-cron" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.156547 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549222-h24gp" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.160140 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.160282 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.161065 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.169347 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549222-h24gp"] Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.218918 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2xw9\" (UniqueName: \"kubernetes.io/projected/9714ba72-0ac4-47ac-9afc-4f62b43ccf38-kube-api-access-m2xw9\") pod \"auto-csr-approver-29549222-h24gp\" (UID: \"9714ba72-0ac4-47ac-9afc-4f62b43ccf38\") " pod="openshift-infra/auto-csr-approver-29549222-h24gp" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.321330 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m2xw9\" (UniqueName: \"kubernetes.io/projected/9714ba72-0ac4-47ac-9afc-4f62b43ccf38-kube-api-access-m2xw9\") pod \"auto-csr-approver-29549222-h24gp\" (UID: \"9714ba72-0ac4-47ac-9afc-4f62b43ccf38\") " pod="openshift-infra/auto-csr-approver-29549222-h24gp" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.347395 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2xw9\" (UniqueName: \"kubernetes.io/projected/9714ba72-0ac4-47ac-9afc-4f62b43ccf38-kube-api-access-m2xw9\") pod \"auto-csr-approver-29549222-h24gp\" (UID: \"9714ba72-0ac4-47ac-9afc-4f62b43ccf38\") " pod="openshift-infra/auto-csr-approver-29549222-h24gp" Mar 08 07:02:00 crc kubenswrapper[4717]: I0308 07:02:00.476781 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549222-h24gp" Mar 08 07:02:01 crc kubenswrapper[4717]: I0308 07:02:01.068912 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549222-h24gp"] Mar 08 07:02:01 crc kubenswrapper[4717]: I0308 07:02:01.992227 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549222-h24gp" event={"ID":"9714ba72-0ac4-47ac-9afc-4f62b43ccf38","Type":"ContainerStarted","Data":"d1ccaa4f095b9af05b3bfc329d8fc1c9755442c21bdf792476caf268a5f34e95"} Mar 08 07:02:03 crc kubenswrapper[4717]: I0308 07:02:03.021287 4717 generic.go:334] "Generic (PLEG): container finished" podID="9714ba72-0ac4-47ac-9afc-4f62b43ccf38" containerID="f304960f7f7ca3418138cdf14afd5033741672e19ef5255cf46230a950af72c4" exitCode=0 Mar 08 07:02:03 crc kubenswrapper[4717]: I0308 07:02:03.021397 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549222-h24gp" 
event={"ID":"9714ba72-0ac4-47ac-9afc-4f62b43ccf38","Type":"ContainerDied","Data":"f304960f7f7ca3418138cdf14afd5033741672e19ef5255cf46230a950af72c4"} Mar 08 07:02:04 crc kubenswrapper[4717]: I0308 07:02:04.126264 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 07:02:04 crc kubenswrapper[4717]: I0308 07:02:04.126678 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 07:02:04 crc kubenswrapper[4717]: I0308 07:02:04.126753 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 07:02:04 crc kubenswrapper[4717]: I0308 07:02:04.127627 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 07:02:04 crc kubenswrapper[4717]: I0308 07:02:04.127721 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" gracePeriod=600 Mar 08 07:02:04 crc kubenswrapper[4717]: E0308 07:02:04.283532 4717 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:02:04 crc kubenswrapper[4717]: I0308 07:02:04.495214 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549222-h24gp" Mar 08 07:02:04 crc kubenswrapper[4717]: I0308 07:02:04.619792 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2xw9\" (UniqueName: \"kubernetes.io/projected/9714ba72-0ac4-47ac-9afc-4f62b43ccf38-kube-api-access-m2xw9\") pod \"9714ba72-0ac4-47ac-9afc-4f62b43ccf38\" (UID: \"9714ba72-0ac4-47ac-9afc-4f62b43ccf38\") " Mar 08 07:02:04 crc kubenswrapper[4717]: I0308 07:02:04.627463 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9714ba72-0ac4-47ac-9afc-4f62b43ccf38-kube-api-access-m2xw9" (OuterVolumeSpecName: "kube-api-access-m2xw9") pod "9714ba72-0ac4-47ac-9afc-4f62b43ccf38" (UID: "9714ba72-0ac4-47ac-9afc-4f62b43ccf38"). InnerVolumeSpecName "kube-api-access-m2xw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:02:04 crc kubenswrapper[4717]: I0308 07:02:04.722793 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2xw9\" (UniqueName: \"kubernetes.io/projected/9714ba72-0ac4-47ac-9afc-4f62b43ccf38-kube-api-access-m2xw9\") on node \"crc\" DevicePath \"\"" Mar 08 07:02:05 crc kubenswrapper[4717]: I0308 07:02:05.047270 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" exitCode=0 Mar 08 07:02:05 crc kubenswrapper[4717]: I0308 07:02:05.047368 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f"} Mar 08 07:02:05 crc kubenswrapper[4717]: I0308 07:02:05.047653 4717 scope.go:117] "RemoveContainer" containerID="7bb4ca3169b930f0d41e9232d9c0d19998dde248047150051071c9eed9222ab2" Mar 08 07:02:05 crc kubenswrapper[4717]: I0308 07:02:05.048309 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:02:05 crc kubenswrapper[4717]: E0308 07:02:05.048596 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:02:05 crc kubenswrapper[4717]: I0308 07:02:05.050251 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549222-h24gp" 
event={"ID":"9714ba72-0ac4-47ac-9afc-4f62b43ccf38","Type":"ContainerDied","Data":"d1ccaa4f095b9af05b3bfc329d8fc1c9755442c21bdf792476caf268a5f34e95"} Mar 08 07:02:05 crc kubenswrapper[4717]: I0308 07:02:05.050271 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549222-h24gp" Mar 08 07:02:05 crc kubenswrapper[4717]: I0308 07:02:05.050278 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ccaa4f095b9af05b3bfc329d8fc1c9755442c21bdf792476caf268a5f34e95" Mar 08 07:02:05 crc kubenswrapper[4717]: I0308 07:02:05.597304 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549216-xz7dn"] Mar 08 07:02:05 crc kubenswrapper[4717]: I0308 07:02:05.615902 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549216-xz7dn"] Mar 08 07:02:05 crc kubenswrapper[4717]: I0308 07:02:05.799873 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07e1d41e-d3f2-4832-920d-38d43f4ef25f" path="/var/lib/kubelet/pods/07e1d41e-d3f2-4832-920d-38d43f4ef25f/volumes" Mar 08 07:02:16 crc kubenswrapper[4717]: I0308 07:02:16.783541 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:02:16 crc kubenswrapper[4717]: E0308 07:02:16.784725 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:02:30 crc kubenswrapper[4717]: I0308 07:02:30.782499 4717 scope.go:117] "RemoveContainer" 
containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:02:30 crc kubenswrapper[4717]: E0308 07:02:30.783350 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:02:42 crc kubenswrapper[4717]: I0308 07:02:42.132674 4717 scope.go:117] "RemoveContainer" containerID="7a6b96c58da60e255cd2be0a2630c58ca02cf2e3d940d1d432b9c7246294432c" Mar 08 07:02:44 crc kubenswrapper[4717]: I0308 07:02:44.782108 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:02:44 crc kubenswrapper[4717]: E0308 07:02:44.782518 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:02:55 crc kubenswrapper[4717]: I0308 07:02:55.781790 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:02:55 crc kubenswrapper[4717]: E0308 07:02:55.782579 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:03:04 crc kubenswrapper[4717]: I0308 07:03:04.792006 4717 generic.go:334] "Generic (PLEG): container finished" podID="9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" containerID="aedada7002bc7cd903862db28b3be4af657d9492ab80d72f555cb1fef158dea9" exitCode=0 Mar 08 07:03:04 crc kubenswrapper[4717]: I0308 07:03:04.793004 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsgn/must-gather-gwp6z" event={"ID":"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4","Type":"ContainerDied","Data":"aedada7002bc7cd903862db28b3be4af657d9492ab80d72f555cb1fef158dea9"} Mar 08 07:03:04 crc kubenswrapper[4717]: I0308 07:03:04.793950 4717 scope.go:117] "RemoveContainer" containerID="aedada7002bc7cd903862db28b3be4af657d9492ab80d72f555cb1fef158dea9" Mar 08 07:03:05 crc kubenswrapper[4717]: I0308 07:03:05.556459 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4xsgn_must-gather-gwp6z_9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4/gather/0.log" Mar 08 07:03:06 crc kubenswrapper[4717]: I0308 07:03:06.782781 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:03:06 crc kubenswrapper[4717]: E0308 07:03:06.783738 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:03:14 crc kubenswrapper[4717]: I0308 07:03:14.617263 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4xsgn/must-gather-gwp6z"] Mar 08 07:03:14 crc kubenswrapper[4717]: I0308 07:03:14.619083 
4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4xsgn/must-gather-gwp6z" podUID="9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" containerName="copy" containerID="cri-o://28f714a42332a54d7a953c2e34d6c3fb49fb21e4759cb15f75f51d0d71abd5a1" gracePeriod=2 Mar 08 07:03:14 crc kubenswrapper[4717]: I0308 07:03:14.625374 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4xsgn/must-gather-gwp6z"] Mar 08 07:03:14 crc kubenswrapper[4717]: I0308 07:03:14.949727 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4xsgn_must-gather-gwp6z_9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4/copy/0.log" Mar 08 07:03:14 crc kubenswrapper[4717]: I0308 07:03:14.950656 4717 generic.go:334] "Generic (PLEG): container finished" podID="9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" containerID="28f714a42332a54d7a953c2e34d6c3fb49fb21e4759cb15f75f51d0d71abd5a1" exitCode=143 Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.086183 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4xsgn_must-gather-gwp6z_9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4/copy/0.log" Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.086764 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsgn/must-gather-gwp6z" Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.280747 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzjg4\" (UniqueName: \"kubernetes.io/projected/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-kube-api-access-bzjg4\") pod \"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4\" (UID: \"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4\") " Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.280936 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-must-gather-output\") pod \"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4\" (UID: \"9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4\") " Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.289422 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-kube-api-access-bzjg4" (OuterVolumeSpecName: "kube-api-access-bzjg4") pod "9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" (UID: "9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4"). InnerVolumeSpecName "kube-api-access-bzjg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.383603 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzjg4\" (UniqueName: \"kubernetes.io/projected/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-kube-api-access-bzjg4\") on node \"crc\" DevicePath \"\"" Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.483433 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" (UID: "9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.485022 4717 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.813059 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" path="/var/lib/kubelet/pods/9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4/volumes" Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.963596 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4xsgn_must-gather-gwp6z_9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4/copy/0.log" Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.964001 4717 scope.go:117] "RemoveContainer" containerID="28f714a42332a54d7a953c2e34d6c3fb49fb21e4759cb15f75f51d0d71abd5a1" Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.964099 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsgn/must-gather-gwp6z" Mar 08 07:03:15 crc kubenswrapper[4717]: I0308 07:03:15.991065 4717 scope.go:117] "RemoveContainer" containerID="aedada7002bc7cd903862db28b3be4af657d9492ab80d72f555cb1fef158dea9" Mar 08 07:03:21 crc kubenswrapper[4717]: I0308 07:03:21.782580 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:03:21 crc kubenswrapper[4717]: E0308 07:03:21.783347 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:03:35 crc kubenswrapper[4717]: I0308 07:03:35.782950 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:03:35 crc kubenswrapper[4717]: E0308 07:03:35.783915 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:03:42 crc kubenswrapper[4717]: I0308 07:03:42.259322 4717 scope.go:117] "RemoveContainer" containerID="20decba40c27a2286c4e05415bc45caab45a1a637322d6d592996da59bf63708" Mar 08 07:03:49 crc kubenswrapper[4717]: I0308 07:03:49.782311 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:03:49 crc kubenswrapper[4717]: 
E0308 07:03:49.783721 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.175013 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549224-dr225"] Mar 08 07:04:00 crc kubenswrapper[4717]: E0308 07:04:00.176414 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9714ba72-0ac4-47ac-9afc-4f62b43ccf38" containerName="oc" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.176437 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9714ba72-0ac4-47ac-9afc-4f62b43ccf38" containerName="oc" Mar 08 07:04:00 crc kubenswrapper[4717]: E0308 07:04:00.176490 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" containerName="copy" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.176503 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" containerName="copy" Mar 08 07:04:00 crc kubenswrapper[4717]: E0308 07:04:00.176539 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" containerName="gather" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.176552 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" containerName="gather" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.176948 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9714ba72-0ac4-47ac-9afc-4f62b43ccf38" containerName="oc" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 
07:04:00.176978 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" containerName="copy" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.177012 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9957046f-e4c3-4b9a-be5e-f0c2bdf67ab4" containerName="gather" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.178307 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549224-dr225" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.182576 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.182576 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.182585 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.196289 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549224-dr225"] Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.332287 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9ps2\" (UniqueName: \"kubernetes.io/projected/0d07649a-00a6-490f-a0ff-7386ab7ca669-kube-api-access-d9ps2\") pod \"auto-csr-approver-29549224-dr225\" (UID: \"0d07649a-00a6-490f-a0ff-7386ab7ca669\") " pod="openshift-infra/auto-csr-approver-29549224-dr225" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.434042 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9ps2\" (UniqueName: \"kubernetes.io/projected/0d07649a-00a6-490f-a0ff-7386ab7ca669-kube-api-access-d9ps2\") pod \"auto-csr-approver-29549224-dr225\" (UID: 
\"0d07649a-00a6-490f-a0ff-7386ab7ca669\") " pod="openshift-infra/auto-csr-approver-29549224-dr225" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.455396 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9ps2\" (UniqueName: \"kubernetes.io/projected/0d07649a-00a6-490f-a0ff-7386ab7ca669-kube-api-access-d9ps2\") pod \"auto-csr-approver-29549224-dr225\" (UID: \"0d07649a-00a6-490f-a0ff-7386ab7ca669\") " pod="openshift-infra/auto-csr-approver-29549224-dr225" Mar 08 07:04:00 crc kubenswrapper[4717]: I0308 07:04:00.554621 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549224-dr225" Mar 08 07:04:01 crc kubenswrapper[4717]: I0308 07:04:01.001503 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549224-dr225"] Mar 08 07:04:01 crc kubenswrapper[4717]: I0308 07:04:01.010507 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 07:04:01 crc kubenswrapper[4717]: I0308 07:04:01.521236 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549224-dr225" event={"ID":"0d07649a-00a6-490f-a0ff-7386ab7ca669","Type":"ContainerStarted","Data":"c513d63f525ab5f3dfcb472827eba822500910635ca449c2141546545883bb84"} Mar 08 07:04:01 crc kubenswrapper[4717]: I0308 07:04:01.782647 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:04:01 crc kubenswrapper[4717]: E0308 07:04:01.783155 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:04:02 crc kubenswrapper[4717]: I0308 07:04:02.533457 4717 generic.go:334] "Generic (PLEG): container finished" podID="0d07649a-00a6-490f-a0ff-7386ab7ca669" containerID="99c33235fa34c78bde25c11d6e8447fa34d936525be1d95e073c021643372bb4" exitCode=0 Mar 08 07:04:02 crc kubenswrapper[4717]: I0308 07:04:02.533657 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549224-dr225" event={"ID":"0d07649a-00a6-490f-a0ff-7386ab7ca669","Type":"ContainerDied","Data":"99c33235fa34c78bde25c11d6e8447fa34d936525be1d95e073c021643372bb4"} Mar 08 07:04:03 crc kubenswrapper[4717]: I0308 07:04:03.940513 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549224-dr225" Mar 08 07:04:04 crc kubenswrapper[4717]: I0308 07:04:04.120238 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9ps2\" (UniqueName: \"kubernetes.io/projected/0d07649a-00a6-490f-a0ff-7386ab7ca669-kube-api-access-d9ps2\") pod \"0d07649a-00a6-490f-a0ff-7386ab7ca669\" (UID: \"0d07649a-00a6-490f-a0ff-7386ab7ca669\") " Mar 08 07:04:04 crc kubenswrapper[4717]: I0308 07:04:04.125936 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d07649a-00a6-490f-a0ff-7386ab7ca669-kube-api-access-d9ps2" (OuterVolumeSpecName: "kube-api-access-d9ps2") pod "0d07649a-00a6-490f-a0ff-7386ab7ca669" (UID: "0d07649a-00a6-490f-a0ff-7386ab7ca669"). InnerVolumeSpecName "kube-api-access-d9ps2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:04:04 crc kubenswrapper[4717]: I0308 07:04:04.223172 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9ps2\" (UniqueName: \"kubernetes.io/projected/0d07649a-00a6-490f-a0ff-7386ab7ca669-kube-api-access-d9ps2\") on node \"crc\" DevicePath \"\"" Mar 08 07:04:04 crc kubenswrapper[4717]: I0308 07:04:04.558940 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549224-dr225" event={"ID":"0d07649a-00a6-490f-a0ff-7386ab7ca669","Type":"ContainerDied","Data":"c513d63f525ab5f3dfcb472827eba822500910635ca449c2141546545883bb84"} Mar 08 07:04:04 crc kubenswrapper[4717]: I0308 07:04:04.558994 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c513d63f525ab5f3dfcb472827eba822500910635ca449c2141546545883bb84" Mar 08 07:04:04 crc kubenswrapper[4717]: I0308 07:04:04.559098 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549224-dr225" Mar 08 07:04:05 crc kubenswrapper[4717]: I0308 07:04:05.016680 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549218-ws8jp"] Mar 08 07:04:05 crc kubenswrapper[4717]: I0308 07:04:05.025199 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549218-ws8jp"] Mar 08 07:04:05 crc kubenswrapper[4717]: I0308 07:04:05.825130 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23243fde-3416-4769-9537-e21316883021" path="/var/lib/kubelet/pods/23243fde-3416-4769-9537-e21316883021/volumes" Mar 08 07:04:14 crc kubenswrapper[4717]: I0308 07:04:14.782259 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:04:14 crc kubenswrapper[4717]: E0308 07:04:14.783377 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:04:25 crc kubenswrapper[4717]: I0308 07:04:25.782581 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:04:25 crc kubenswrapper[4717]: E0308 07:04:25.783940 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.020597 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wr5fj"] Mar 08 07:04:36 crc kubenswrapper[4717]: E0308 07:04:36.021498 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d07649a-00a6-490f-a0ff-7386ab7ca669" containerName="oc" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.021512 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d07649a-00a6-490f-a0ff-7386ab7ca669" containerName="oc" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.021753 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d07649a-00a6-490f-a0ff-7386ab7ca669" containerName="oc" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.023095 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.037309 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wr5fj"] Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.065016 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-utilities\") pod \"redhat-marketplace-wr5fj\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.065095 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-catalog-content\") pod \"redhat-marketplace-wr5fj\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.065257 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frcsd\" (UniqueName: \"kubernetes.io/projected/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-kube-api-access-frcsd\") pod \"redhat-marketplace-wr5fj\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.166923 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-catalog-content\") pod \"redhat-marketplace-wr5fj\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.167074 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-frcsd\" (UniqueName: \"kubernetes.io/projected/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-kube-api-access-frcsd\") pod \"redhat-marketplace-wr5fj\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.167262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-utilities\") pod \"redhat-marketplace-wr5fj\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.167439 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-catalog-content\") pod \"redhat-marketplace-wr5fj\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.167672 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-utilities\") pod \"redhat-marketplace-wr5fj\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.191400 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frcsd\" (UniqueName: \"kubernetes.io/projected/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-kube-api-access-frcsd\") pod \"redhat-marketplace-wr5fj\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.343198 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.782323 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:04:36 crc kubenswrapper[4717]: E0308 07:04:36.782768 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.834573 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wr5fj"] Mar 08 07:04:36 crc kubenswrapper[4717]: I0308 07:04:36.963586 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr5fj" event={"ID":"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20","Type":"ContainerStarted","Data":"05f010f7747397c19d4897ce22d48ceda00fc34100ab13e8aedef1056a124390"} Mar 08 07:04:37 crc kubenswrapper[4717]: I0308 07:04:37.974179 4717 generic.go:334] "Generic (PLEG): container finished" podID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" containerID="989115562e9f7d9dfb7966e46b47df99e53718a2e5b4a416bd7f78d34444b628" exitCode=0 Mar 08 07:04:37 crc kubenswrapper[4717]: I0308 07:04:37.974268 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr5fj" event={"ID":"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20","Type":"ContainerDied","Data":"989115562e9f7d9dfb7966e46b47df99e53718a2e5b4a416bd7f78d34444b628"} Mar 08 07:04:38 crc kubenswrapper[4717]: I0308 07:04:38.989205 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr5fj" 
event={"ID":"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20","Type":"ContainerStarted","Data":"49d0dff9cf8b1fb3f20d4ad75b643ad34f03e519141c94afba54f87930300b0b"} Mar 08 07:04:40 crc kubenswrapper[4717]: I0308 07:04:40.003864 4717 generic.go:334] "Generic (PLEG): container finished" podID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" containerID="49d0dff9cf8b1fb3f20d4ad75b643ad34f03e519141c94afba54f87930300b0b" exitCode=0 Mar 08 07:04:40 crc kubenswrapper[4717]: I0308 07:04:40.003958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr5fj" event={"ID":"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20","Type":"ContainerDied","Data":"49d0dff9cf8b1fb3f20d4ad75b643ad34f03e519141c94afba54f87930300b0b"} Mar 08 07:04:41 crc kubenswrapper[4717]: I0308 07:04:41.030775 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr5fj" event={"ID":"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20","Type":"ContainerStarted","Data":"050ddc11b88294df219e40b1357f570b83be80e7e264e3153755c30021f5d9aa"} Mar 08 07:04:41 crc kubenswrapper[4717]: I0308 07:04:41.070416 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wr5fj" podStartSLOduration=2.6762821409999997 podStartE2EDuration="5.070390633s" podCreationTimestamp="2026-03-08 07:04:36 +0000 UTC" firstStartedPulling="2026-03-08 07:04:37.975670379 +0000 UTC m=+5904.893319233" lastFinishedPulling="2026-03-08 07:04:40.369778891 +0000 UTC m=+5907.287427725" observedRunningTime="2026-03-08 07:04:41.052448673 +0000 UTC m=+5907.970097547" watchObservedRunningTime="2026-03-08 07:04:41.070390633 +0000 UTC m=+5907.988039497" Mar 08 07:04:42 crc kubenswrapper[4717]: I0308 07:04:42.375782 4717 scope.go:117] "RemoveContainer" containerID="8c8384607ba0d268d4e3239b969696407618d30a0c1e34878c835a94f0a587ad" Mar 08 07:04:46 crc kubenswrapper[4717]: I0308 07:04:46.343448 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:46 crc kubenswrapper[4717]: I0308 07:04:46.343810 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:46 crc kubenswrapper[4717]: I0308 07:04:46.414507 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:47 crc kubenswrapper[4717]: I0308 07:04:47.176404 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:47 crc kubenswrapper[4717]: I0308 07:04:47.243518 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wr5fj"] Mar 08 07:04:49 crc kubenswrapper[4717]: I0308 07:04:49.134983 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wr5fj" podUID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" containerName="registry-server" containerID="cri-o://050ddc11b88294df219e40b1357f570b83be80e7e264e3153755c30021f5d9aa" gracePeriod=2 Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.146971 4717 generic.go:334] "Generic (PLEG): container finished" podID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" containerID="050ddc11b88294df219e40b1357f570b83be80e7e264e3153755c30021f5d9aa" exitCode=0 Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.147066 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr5fj" event={"ID":"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20","Type":"ContainerDied","Data":"050ddc11b88294df219e40b1357f570b83be80e7e264e3153755c30021f5d9aa"} Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.147354 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wr5fj" 
event={"ID":"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20","Type":"ContainerDied","Data":"05f010f7747397c19d4897ce22d48ceda00fc34100ab13e8aedef1056a124390"} Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.147372 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05f010f7747397c19d4897ce22d48ceda00fc34100ab13e8aedef1056a124390" Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.195948 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.335148 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-catalog-content\") pod \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.335279 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frcsd\" (UniqueName: \"kubernetes.io/projected/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-kube-api-access-frcsd\") pod \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.335527 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-utilities\") pod \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\" (UID: \"f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20\") " Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.336326 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-utilities" (OuterVolumeSpecName: "utilities") pod "f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" (UID: "f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.343869 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-kube-api-access-frcsd" (OuterVolumeSpecName: "kube-api-access-frcsd") pod "f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" (UID: "f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20"). InnerVolumeSpecName "kube-api-access-frcsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.374202 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" (UID: "f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.437355 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frcsd\" (UniqueName: \"kubernetes.io/projected/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-kube-api-access-frcsd\") on node \"crc\" DevicePath \"\"" Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.437388 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.437398 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 07:04:50 crc kubenswrapper[4717]: I0308 07:04:50.783428 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 
07:04:50 crc kubenswrapper[4717]: E0308 07:04:50.783959 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:04:51 crc kubenswrapper[4717]: I0308 07:04:51.160279 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wr5fj" Mar 08 07:04:51 crc kubenswrapper[4717]: I0308 07:04:51.211756 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wr5fj"] Mar 08 07:04:51 crc kubenswrapper[4717]: I0308 07:04:51.221029 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wr5fj"] Mar 08 07:04:51 crc kubenswrapper[4717]: I0308 07:04:51.806479 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" path="/var/lib/kubelet/pods/f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20/volumes" Mar 08 07:05:04 crc kubenswrapper[4717]: I0308 07:05:04.781380 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:05:04 crc kubenswrapper[4717]: E0308 07:05:04.782338 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:05:17 crc kubenswrapper[4717]: I0308 
07:05:17.781423 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:05:17 crc kubenswrapper[4717]: E0308 07:05:17.782189 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:05:30 crc kubenswrapper[4717]: I0308 07:05:30.783654 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:05:30 crc kubenswrapper[4717]: E0308 07:05:30.785217 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:05:41 crc kubenswrapper[4717]: I0308 07:05:41.781562 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:05:41 crc kubenswrapper[4717]: E0308 07:05:41.782307 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:05:52 crc 
kubenswrapper[4717]: I0308 07:05:52.782218 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:05:52 crc kubenswrapper[4717]: E0308 07:05:52.783067 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.151502 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549226-9zmmh"] Mar 08 07:06:00 crc kubenswrapper[4717]: E0308 07:06:00.152657 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" containerName="registry-server" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.152674 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" containerName="registry-server" Mar 08 07:06:00 crc kubenswrapper[4717]: E0308 07:06:00.152716 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" containerName="extract-utilities" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.152725 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" containerName="extract-utilities" Mar 08 07:06:00 crc kubenswrapper[4717]: E0308 07:06:00.152744 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" containerName="extract-content" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.152751 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" 
containerName="extract-content" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.152977 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1df2bf6-2be1-4f28-a8c4-4d19eb5ffb20" containerName="registry-server" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.153814 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549226-9zmmh" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.159494 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.160169 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.160537 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.171548 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549226-9zmmh"] Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.205609 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkjt9\" (UniqueName: \"kubernetes.io/projected/e689532a-2392-48b1-8861-5c985aba7af7-kube-api-access-bkjt9\") pod \"auto-csr-approver-29549226-9zmmh\" (UID: \"e689532a-2392-48b1-8861-5c985aba7af7\") " pod="openshift-infra/auto-csr-approver-29549226-9zmmh" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.307554 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkjt9\" (UniqueName: \"kubernetes.io/projected/e689532a-2392-48b1-8861-5c985aba7af7-kube-api-access-bkjt9\") pod \"auto-csr-approver-29549226-9zmmh\" (UID: \"e689532a-2392-48b1-8861-5c985aba7af7\") " pod="openshift-infra/auto-csr-approver-29549226-9zmmh" Mar 08 
07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.327968 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkjt9\" (UniqueName: \"kubernetes.io/projected/e689532a-2392-48b1-8861-5c985aba7af7-kube-api-access-bkjt9\") pod \"auto-csr-approver-29549226-9zmmh\" (UID: \"e689532a-2392-48b1-8861-5c985aba7af7\") " pod="openshift-infra/auto-csr-approver-29549226-9zmmh" Mar 08 07:06:00 crc kubenswrapper[4717]: I0308 07:06:00.480667 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549226-9zmmh" Mar 08 07:06:01 crc kubenswrapper[4717]: I0308 07:06:00.985469 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549226-9zmmh"] Mar 08 07:06:01 crc kubenswrapper[4717]: I0308 07:06:01.987650 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549226-9zmmh" event={"ID":"e689532a-2392-48b1-8861-5c985aba7af7","Type":"ContainerStarted","Data":"b7a0801d45f623e7996ada709f9dca48152b11a5314c0f34fb9b5e82e48443e6"} Mar 08 07:06:02 crc kubenswrapper[4717]: I0308 07:06:02.997668 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549226-9zmmh" event={"ID":"e689532a-2392-48b1-8861-5c985aba7af7","Type":"ContainerStarted","Data":"8a9ff4bbd3fec364b1640e8301e634bc50fc7fd3eb4ed7eb0f6e97503445f899"} Mar 08 07:06:03 crc kubenswrapper[4717]: I0308 07:06:03.013141 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549226-9zmmh" podStartSLOduration=1.530850289 podStartE2EDuration="3.013120707s" podCreationTimestamp="2026-03-08 07:06:00 +0000 UTC" firstStartedPulling="2026-03-08 07:06:01.019829878 +0000 UTC m=+5987.937478722" lastFinishedPulling="2026-03-08 07:06:02.502100256 +0000 UTC m=+5989.419749140" observedRunningTime="2026-03-08 07:06:03.01120081 +0000 UTC m=+5989.928849664" 
watchObservedRunningTime="2026-03-08 07:06:03.013120707 +0000 UTC m=+5989.930769551" Mar 08 07:06:03 crc kubenswrapper[4717]: I0308 07:06:03.788803 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:06:03 crc kubenswrapper[4717]: E0308 07:06:03.789871 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:06:04 crc kubenswrapper[4717]: I0308 07:06:04.008904 4717 generic.go:334] "Generic (PLEG): container finished" podID="e689532a-2392-48b1-8861-5c985aba7af7" containerID="8a9ff4bbd3fec364b1640e8301e634bc50fc7fd3eb4ed7eb0f6e97503445f899" exitCode=0 Mar 08 07:06:04 crc kubenswrapper[4717]: I0308 07:06:04.008958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549226-9zmmh" event={"ID":"e689532a-2392-48b1-8861-5c985aba7af7","Type":"ContainerDied","Data":"8a9ff4bbd3fec364b1640e8301e634bc50fc7fd3eb4ed7eb0f6e97503445f899"} Mar 08 07:06:05 crc kubenswrapper[4717]: I0308 07:06:05.432440 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549226-9zmmh" Mar 08 07:06:05 crc kubenswrapper[4717]: I0308 07:06:05.525224 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkjt9\" (UniqueName: \"kubernetes.io/projected/e689532a-2392-48b1-8861-5c985aba7af7-kube-api-access-bkjt9\") pod \"e689532a-2392-48b1-8861-5c985aba7af7\" (UID: \"e689532a-2392-48b1-8861-5c985aba7af7\") " Mar 08 07:06:05 crc kubenswrapper[4717]: I0308 07:06:05.534121 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e689532a-2392-48b1-8861-5c985aba7af7-kube-api-access-bkjt9" (OuterVolumeSpecName: "kube-api-access-bkjt9") pod "e689532a-2392-48b1-8861-5c985aba7af7" (UID: "e689532a-2392-48b1-8861-5c985aba7af7"). InnerVolumeSpecName "kube-api-access-bkjt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:06:05 crc kubenswrapper[4717]: I0308 07:06:05.628026 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkjt9\" (UniqueName: \"kubernetes.io/projected/e689532a-2392-48b1-8861-5c985aba7af7-kube-api-access-bkjt9\") on node \"crc\" DevicePath \"\"" Mar 08 07:06:06 crc kubenswrapper[4717]: I0308 07:06:06.030561 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549226-9zmmh" event={"ID":"e689532a-2392-48b1-8861-5c985aba7af7","Type":"ContainerDied","Data":"b7a0801d45f623e7996ada709f9dca48152b11a5314c0f34fb9b5e82e48443e6"} Mar 08 07:06:06 crc kubenswrapper[4717]: I0308 07:06:06.030602 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7a0801d45f623e7996ada709f9dca48152b11a5314c0f34fb9b5e82e48443e6" Mar 08 07:06:06 crc kubenswrapper[4717]: I0308 07:06:06.030697 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549226-9zmmh" Mar 08 07:06:06 crc kubenswrapper[4717]: I0308 07:06:06.075224 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549220-ztvdz"] Mar 08 07:06:06 crc kubenswrapper[4717]: I0308 07:06:06.083801 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549220-ztvdz"] Mar 08 07:06:07 crc kubenswrapper[4717]: I0308 07:06:07.801502 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f67f15a-7766-4e30-b56b-d4bdc49e5cdd" path="/var/lib/kubelet/pods/1f67f15a-7766-4e30-b56b-d4bdc49e5cdd/volumes" Mar 08 07:06:17 crc kubenswrapper[4717]: I0308 07:06:17.782525 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:06:17 crc kubenswrapper[4717]: E0308 07:06:17.783263 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:06:29 crc kubenswrapper[4717]: I0308 07:06:29.782939 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:06:29 crc kubenswrapper[4717]: E0308 07:06:29.783956 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:06:42 crc kubenswrapper[4717]: I0308 07:06:42.501539 4717 scope.go:117] "RemoveContainer" containerID="2270a8149ef74bd357dff94b9661a23fa028748183aeeb1eb926bee96a538cca" Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.790757 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:06:43 crc kubenswrapper[4717]: E0308 07:06:43.791509 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.853574 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k72vp/must-gather-hcmhm"] Mar 08 07:06:43 crc kubenswrapper[4717]: E0308 07:06:43.853958 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e689532a-2392-48b1-8861-5c985aba7af7" containerName="oc" Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.853974 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e689532a-2392-48b1-8861-5c985aba7af7" containerName="oc" Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.854168 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e689532a-2392-48b1-8861-5c985aba7af7" containerName="oc" Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.856537 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k72vp/must-gather-hcmhm" Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.878404 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k72vp"/"kube-root-ca.crt" Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.878734 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-k72vp"/"default-dockercfg-h7v4x" Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.878906 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k72vp"/"openshift-service-ca.crt" Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.884491 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k72vp/must-gather-hcmhm"] Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.973891 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/299901e4-82e9-4b87-b34a-177d62deaa7b-must-gather-output\") pod \"must-gather-hcmhm\" (UID: \"299901e4-82e9-4b87-b34a-177d62deaa7b\") " pod="openshift-must-gather-k72vp/must-gather-hcmhm" Mar 08 07:06:43 crc kubenswrapper[4717]: I0308 07:06:43.974338 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mqr2\" (UniqueName: \"kubernetes.io/projected/299901e4-82e9-4b87-b34a-177d62deaa7b-kube-api-access-5mqr2\") pod \"must-gather-hcmhm\" (UID: \"299901e4-82e9-4b87-b34a-177d62deaa7b\") " pod="openshift-must-gather-k72vp/must-gather-hcmhm" Mar 08 07:06:44 crc kubenswrapper[4717]: I0308 07:06:44.076145 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mqr2\" (UniqueName: \"kubernetes.io/projected/299901e4-82e9-4b87-b34a-177d62deaa7b-kube-api-access-5mqr2\") pod \"must-gather-hcmhm\" (UID: \"299901e4-82e9-4b87-b34a-177d62deaa7b\") " 
pod="openshift-must-gather-k72vp/must-gather-hcmhm" Mar 08 07:06:44 crc kubenswrapper[4717]: I0308 07:06:44.076485 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/299901e4-82e9-4b87-b34a-177d62deaa7b-must-gather-output\") pod \"must-gather-hcmhm\" (UID: \"299901e4-82e9-4b87-b34a-177d62deaa7b\") " pod="openshift-must-gather-k72vp/must-gather-hcmhm" Mar 08 07:06:44 crc kubenswrapper[4717]: I0308 07:06:44.076952 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/299901e4-82e9-4b87-b34a-177d62deaa7b-must-gather-output\") pod \"must-gather-hcmhm\" (UID: \"299901e4-82e9-4b87-b34a-177d62deaa7b\") " pod="openshift-must-gather-k72vp/must-gather-hcmhm" Mar 08 07:06:44 crc kubenswrapper[4717]: I0308 07:06:44.093251 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mqr2\" (UniqueName: \"kubernetes.io/projected/299901e4-82e9-4b87-b34a-177d62deaa7b-kube-api-access-5mqr2\") pod \"must-gather-hcmhm\" (UID: \"299901e4-82e9-4b87-b34a-177d62deaa7b\") " pod="openshift-must-gather-k72vp/must-gather-hcmhm" Mar 08 07:06:44 crc kubenswrapper[4717]: I0308 07:06:44.193250 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k72vp/must-gather-hcmhm" Mar 08 07:06:44 crc kubenswrapper[4717]: I0308 07:06:44.755294 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k72vp/must-gather-hcmhm"] Mar 08 07:06:44 crc kubenswrapper[4717]: W0308 07:06:44.757051 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod299901e4_82e9_4b87_b34a_177d62deaa7b.slice/crio-9308bd8fb6a96244897992272c0d6d6358169b84861f79aeef42d7399564f65f WatchSource:0}: Error finding container 9308bd8fb6a96244897992272c0d6d6358169b84861f79aeef42d7399564f65f: Status 404 returned error can't find the container with id 9308bd8fb6a96244897992272c0d6d6358169b84861f79aeef42d7399564f65f Mar 08 07:06:45 crc kubenswrapper[4717]: I0308 07:06:45.459822 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/must-gather-hcmhm" event={"ID":"299901e4-82e9-4b87-b34a-177d62deaa7b","Type":"ContainerStarted","Data":"cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263"} Mar 08 07:06:45 crc kubenswrapper[4717]: I0308 07:06:45.460332 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/must-gather-hcmhm" event={"ID":"299901e4-82e9-4b87-b34a-177d62deaa7b","Type":"ContainerStarted","Data":"d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996"} Mar 08 07:06:45 crc kubenswrapper[4717]: I0308 07:06:45.460346 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/must-gather-hcmhm" event={"ID":"299901e4-82e9-4b87-b34a-177d62deaa7b","Type":"ContainerStarted","Data":"9308bd8fb6a96244897992272c0d6d6358169b84861f79aeef42d7399564f65f"} Mar 08 07:06:45 crc kubenswrapper[4717]: I0308 07:06:45.492961 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k72vp/must-gather-hcmhm" podStartSLOduration=2.492944049 
podStartE2EDuration="2.492944049s" podCreationTimestamp="2026-03-08 07:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 07:06:45.479608282 +0000 UTC m=+6032.397257136" watchObservedRunningTime="2026-03-08 07:06:45.492944049 +0000 UTC m=+6032.410592893" Mar 08 07:06:48 crc kubenswrapper[4717]: I0308 07:06:48.646264 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k72vp/crc-debug-gch2j"] Mar 08 07:06:48 crc kubenswrapper[4717]: I0308 07:06:48.648472 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-gch2j" Mar 08 07:06:48 crc kubenswrapper[4717]: I0308 07:06:48.679492 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a05b4cba-7f14-468e-adb7-138b3bb733ae-host\") pod \"crc-debug-gch2j\" (UID: \"a05b4cba-7f14-468e-adb7-138b3bb733ae\") " pod="openshift-must-gather-k72vp/crc-debug-gch2j" Mar 08 07:06:48 crc kubenswrapper[4717]: I0308 07:06:48.679810 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qt8j\" (UniqueName: \"kubernetes.io/projected/a05b4cba-7f14-468e-adb7-138b3bb733ae-kube-api-access-5qt8j\") pod \"crc-debug-gch2j\" (UID: \"a05b4cba-7f14-468e-adb7-138b3bb733ae\") " pod="openshift-must-gather-k72vp/crc-debug-gch2j" Mar 08 07:06:48 crc kubenswrapper[4717]: I0308 07:06:48.782169 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qt8j\" (UniqueName: \"kubernetes.io/projected/a05b4cba-7f14-468e-adb7-138b3bb733ae-kube-api-access-5qt8j\") pod \"crc-debug-gch2j\" (UID: \"a05b4cba-7f14-468e-adb7-138b3bb733ae\") " pod="openshift-must-gather-k72vp/crc-debug-gch2j" Mar 08 07:06:48 crc kubenswrapper[4717]: I0308 07:06:48.782258 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a05b4cba-7f14-468e-adb7-138b3bb733ae-host\") pod \"crc-debug-gch2j\" (UID: \"a05b4cba-7f14-468e-adb7-138b3bb733ae\") " pod="openshift-must-gather-k72vp/crc-debug-gch2j" Mar 08 07:06:48 crc kubenswrapper[4717]: I0308 07:06:48.782376 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a05b4cba-7f14-468e-adb7-138b3bb733ae-host\") pod \"crc-debug-gch2j\" (UID: \"a05b4cba-7f14-468e-adb7-138b3bb733ae\") " pod="openshift-must-gather-k72vp/crc-debug-gch2j" Mar 08 07:06:48 crc kubenswrapper[4717]: I0308 07:06:48.805681 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qt8j\" (UniqueName: \"kubernetes.io/projected/a05b4cba-7f14-468e-adb7-138b3bb733ae-kube-api-access-5qt8j\") pod \"crc-debug-gch2j\" (UID: \"a05b4cba-7f14-468e-adb7-138b3bb733ae\") " pod="openshift-must-gather-k72vp/crc-debug-gch2j" Mar 08 07:06:48 crc kubenswrapper[4717]: I0308 07:06:48.967990 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-gch2j" Mar 08 07:06:49 crc kubenswrapper[4717]: I0308 07:06:49.498018 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/crc-debug-gch2j" event={"ID":"a05b4cba-7f14-468e-adb7-138b3bb733ae","Type":"ContainerStarted","Data":"4b87603a7f42e143849d1a837ae59917645ea6a6a2aa39c97ce336e1c59429f8"} Mar 08 07:06:49 crc kubenswrapper[4717]: I0308 07:06:49.498754 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/crc-debug-gch2j" event={"ID":"a05b4cba-7f14-468e-adb7-138b3bb733ae","Type":"ContainerStarted","Data":"61b942170f8db39d670a8f0d80ca090c04a5d722df24fb24c1c3f1b35eeb2e6d"} Mar 08 07:06:49 crc kubenswrapper[4717]: I0308 07:06:49.514254 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k72vp/crc-debug-gch2j" podStartSLOduration=1.514236046 podStartE2EDuration="1.514236046s" podCreationTimestamp="2026-03-08 07:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 07:06:49.510927295 +0000 UTC m=+6036.428576139" watchObservedRunningTime="2026-03-08 07:06:49.514236046 +0000 UTC m=+6036.431884890" Mar 08 07:06:57 crc kubenswrapper[4717]: I0308 07:06:57.782000 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:06:57 crc kubenswrapper[4717]: E0308 07:06:57.782934 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:07:11 crc 
kubenswrapper[4717]: I0308 07:07:11.786748 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:07:12 crc kubenswrapper[4717]: I0308 07:07:12.710800 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"efc60f3a3d797f62c0e0f878c3cc6f0020d779e8e22bd018a1c6e5cf0f474597"} Mar 08 07:07:30 crc kubenswrapper[4717]: I0308 07:07:30.901816 4717 generic.go:334] "Generic (PLEG): container finished" podID="a05b4cba-7f14-468e-adb7-138b3bb733ae" containerID="4b87603a7f42e143849d1a837ae59917645ea6a6a2aa39c97ce336e1c59429f8" exitCode=0 Mar 08 07:07:30 crc kubenswrapper[4717]: I0308 07:07:30.901892 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/crc-debug-gch2j" event={"ID":"a05b4cba-7f14-468e-adb7-138b3bb733ae","Type":"ContainerDied","Data":"4b87603a7f42e143849d1a837ae59917645ea6a6a2aa39c97ce336e1c59429f8"} Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.017437 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-gch2j" Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.051500 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k72vp/crc-debug-gch2j"] Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.060402 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k72vp/crc-debug-gch2j"] Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.064807 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a05b4cba-7f14-468e-adb7-138b3bb733ae-host\") pod \"a05b4cba-7f14-468e-adb7-138b3bb733ae\" (UID: \"a05b4cba-7f14-468e-adb7-138b3bb733ae\") " Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.065101 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qt8j\" (UniqueName: \"kubernetes.io/projected/a05b4cba-7f14-468e-adb7-138b3bb733ae-kube-api-access-5qt8j\") pod \"a05b4cba-7f14-468e-adb7-138b3bb733ae\" (UID: \"a05b4cba-7f14-468e-adb7-138b3bb733ae\") " Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.066504 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a05b4cba-7f14-468e-adb7-138b3bb733ae-host" (OuterVolumeSpecName: "host") pod "a05b4cba-7f14-468e-adb7-138b3bb733ae" (UID: "a05b4cba-7f14-468e-adb7-138b3bb733ae"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.070839 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a05b4cba-7f14-468e-adb7-138b3bb733ae-kube-api-access-5qt8j" (OuterVolumeSpecName: "kube-api-access-5qt8j") pod "a05b4cba-7f14-468e-adb7-138b3bb733ae" (UID: "a05b4cba-7f14-468e-adb7-138b3bb733ae"). InnerVolumeSpecName "kube-api-access-5qt8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.167712 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qt8j\" (UniqueName: \"kubernetes.io/projected/a05b4cba-7f14-468e-adb7-138b3bb733ae-kube-api-access-5qt8j\") on node \"crc\" DevicePath \"\"" Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.167750 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a05b4cba-7f14-468e-adb7-138b3bb733ae-host\") on node \"crc\" DevicePath \"\"" Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.921161 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61b942170f8db39d670a8f0d80ca090c04a5d722df24fb24c1c3f1b35eeb2e6d" Mar 08 07:07:32 crc kubenswrapper[4717]: I0308 07:07:32.921219 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-gch2j" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.248436 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k72vp/crc-debug-ctwk4"] Mar 08 07:07:33 crc kubenswrapper[4717]: E0308 07:07:33.248826 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05b4cba-7f14-468e-adb7-138b3bb733ae" containerName="container-00" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.248839 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05b4cba-7f14-468e-adb7-138b3bb733ae" containerName="container-00" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.249002 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a05b4cba-7f14-468e-adb7-138b3bb733ae" containerName="container-00" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.249581 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-ctwk4" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.289491 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4pg\" (UniqueName: \"kubernetes.io/projected/5757814d-9f4b-418f-b972-674c824db1dc-kube-api-access-xc4pg\") pod \"crc-debug-ctwk4\" (UID: \"5757814d-9f4b-418f-b972-674c824db1dc\") " pod="openshift-must-gather-k72vp/crc-debug-ctwk4" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.289561 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5757814d-9f4b-418f-b972-674c824db1dc-host\") pod \"crc-debug-ctwk4\" (UID: \"5757814d-9f4b-418f-b972-674c824db1dc\") " pod="openshift-must-gather-k72vp/crc-debug-ctwk4" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.391303 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4pg\" (UniqueName: \"kubernetes.io/projected/5757814d-9f4b-418f-b972-674c824db1dc-kube-api-access-xc4pg\") pod \"crc-debug-ctwk4\" (UID: \"5757814d-9f4b-418f-b972-674c824db1dc\") " pod="openshift-must-gather-k72vp/crc-debug-ctwk4" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.391410 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5757814d-9f4b-418f-b972-674c824db1dc-host\") pod \"crc-debug-ctwk4\" (UID: \"5757814d-9f4b-418f-b972-674c824db1dc\") " pod="openshift-must-gather-k72vp/crc-debug-ctwk4" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.391542 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5757814d-9f4b-418f-b972-674c824db1dc-host\") pod \"crc-debug-ctwk4\" (UID: \"5757814d-9f4b-418f-b972-674c824db1dc\") " pod="openshift-must-gather-k72vp/crc-debug-ctwk4" Mar 08 07:07:33 crc 
kubenswrapper[4717]: I0308 07:07:33.410228 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4pg\" (UniqueName: \"kubernetes.io/projected/5757814d-9f4b-418f-b972-674c824db1dc-kube-api-access-xc4pg\") pod \"crc-debug-ctwk4\" (UID: \"5757814d-9f4b-418f-b972-674c824db1dc\") " pod="openshift-must-gather-k72vp/crc-debug-ctwk4" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.570005 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-ctwk4" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.793502 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a05b4cba-7f14-468e-adb7-138b3bb733ae" path="/var/lib/kubelet/pods/a05b4cba-7f14-468e-adb7-138b3bb733ae/volumes" Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.930956 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/crc-debug-ctwk4" event={"ID":"5757814d-9f4b-418f-b972-674c824db1dc","Type":"ContainerStarted","Data":"6aa1d56881f16e87457ac7a3962865e335782787ec84428edff6878f108953a8"} Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.930999 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/crc-debug-ctwk4" event={"ID":"5757814d-9f4b-418f-b972-674c824db1dc","Type":"ContainerStarted","Data":"c2bc899093a8cade91cb153bef029ce20f7b7250160cdf433ac0271909c61ee5"} Mar 08 07:07:33 crc kubenswrapper[4717]: I0308 07:07:33.942826 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k72vp/crc-debug-ctwk4" podStartSLOduration=0.942807805 podStartE2EDuration="942.807805ms" podCreationTimestamp="2026-03-08 07:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 07:07:33.941032382 +0000 UTC m=+6080.858681226" watchObservedRunningTime="2026-03-08 07:07:33.942807805 +0000 
UTC m=+6080.860456649" Mar 08 07:07:34 crc kubenswrapper[4717]: I0308 07:07:34.943233 4717 generic.go:334] "Generic (PLEG): container finished" podID="5757814d-9f4b-418f-b972-674c824db1dc" containerID="6aa1d56881f16e87457ac7a3962865e335782787ec84428edff6878f108953a8" exitCode=0 Mar 08 07:07:34 crc kubenswrapper[4717]: I0308 07:07:34.943340 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/crc-debug-ctwk4" event={"ID":"5757814d-9f4b-418f-b972-674c824db1dc","Type":"ContainerDied","Data":"6aa1d56881f16e87457ac7a3962865e335782787ec84428edff6878f108953a8"} Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.045819 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-ctwk4" Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.136233 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc4pg\" (UniqueName: \"kubernetes.io/projected/5757814d-9f4b-418f-b972-674c824db1dc-kube-api-access-xc4pg\") pod \"5757814d-9f4b-418f-b972-674c824db1dc\" (UID: \"5757814d-9f4b-418f-b972-674c824db1dc\") " Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.136292 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5757814d-9f4b-418f-b972-674c824db1dc-host\") pod \"5757814d-9f4b-418f-b972-674c824db1dc\" (UID: \"5757814d-9f4b-418f-b972-674c824db1dc\") " Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.136422 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5757814d-9f4b-418f-b972-674c824db1dc-host" (OuterVolumeSpecName: "host") pod "5757814d-9f4b-418f-b972-674c824db1dc" (UID: "5757814d-9f4b-418f-b972-674c824db1dc"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.136784 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5757814d-9f4b-418f-b972-674c824db1dc-host\") on node \"crc\" DevicePath \"\"" Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.159960 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5757814d-9f4b-418f-b972-674c824db1dc-kube-api-access-xc4pg" (OuterVolumeSpecName: "kube-api-access-xc4pg") pod "5757814d-9f4b-418f-b972-674c824db1dc" (UID: "5757814d-9f4b-418f-b972-674c824db1dc"). InnerVolumeSpecName "kube-api-access-xc4pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.239088 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc4pg\" (UniqueName: \"kubernetes.io/projected/5757814d-9f4b-418f-b972-674c824db1dc-kube-api-access-xc4pg\") on node \"crc\" DevicePath \"\"" Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.305960 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k72vp/crc-debug-ctwk4"] Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.314343 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k72vp/crc-debug-ctwk4"] Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.960654 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2bc899093a8cade91cb153bef029ce20f7b7250160cdf433ac0271909c61ee5" Mar 08 07:07:36 crc kubenswrapper[4717]: I0308 07:07:36.961082 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-ctwk4" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.490547 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k72vp/crc-debug-b4p8n"] Mar 08 07:07:37 crc kubenswrapper[4717]: E0308 07:07:37.490997 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5757814d-9f4b-418f-b972-674c824db1dc" containerName="container-00" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.491010 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5757814d-9f4b-418f-b972-674c824db1dc" containerName="container-00" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.491241 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5757814d-9f4b-418f-b972-674c824db1dc" containerName="container-00" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.491969 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-b4p8n" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.564423 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z74mh\" (UniqueName: \"kubernetes.io/projected/b73b6a1d-6f84-408a-a15d-a313242eccc7-kube-api-access-z74mh\") pod \"crc-debug-b4p8n\" (UID: \"b73b6a1d-6f84-408a-a15d-a313242eccc7\") " pod="openshift-must-gather-k72vp/crc-debug-b4p8n" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.564497 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b73b6a1d-6f84-408a-a15d-a313242eccc7-host\") pod \"crc-debug-b4p8n\" (UID: \"b73b6a1d-6f84-408a-a15d-a313242eccc7\") " pod="openshift-must-gather-k72vp/crc-debug-b4p8n" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.667393 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74mh\" (UniqueName: 
\"kubernetes.io/projected/b73b6a1d-6f84-408a-a15d-a313242eccc7-kube-api-access-z74mh\") pod \"crc-debug-b4p8n\" (UID: \"b73b6a1d-6f84-408a-a15d-a313242eccc7\") " pod="openshift-must-gather-k72vp/crc-debug-b4p8n" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.667503 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b73b6a1d-6f84-408a-a15d-a313242eccc7-host\") pod \"crc-debug-b4p8n\" (UID: \"b73b6a1d-6f84-408a-a15d-a313242eccc7\") " pod="openshift-must-gather-k72vp/crc-debug-b4p8n" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.667648 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b73b6a1d-6f84-408a-a15d-a313242eccc7-host\") pod \"crc-debug-b4p8n\" (UID: \"b73b6a1d-6f84-408a-a15d-a313242eccc7\") " pod="openshift-must-gather-k72vp/crc-debug-b4p8n" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.691333 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z74mh\" (UniqueName: \"kubernetes.io/projected/b73b6a1d-6f84-408a-a15d-a313242eccc7-kube-api-access-z74mh\") pod \"crc-debug-b4p8n\" (UID: \"b73b6a1d-6f84-408a-a15d-a313242eccc7\") " pod="openshift-must-gather-k72vp/crc-debug-b4p8n" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.794800 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5757814d-9f4b-418f-b972-674c824db1dc" path="/var/lib/kubelet/pods/5757814d-9f4b-418f-b972-674c824db1dc/volumes" Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.807509 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-b4p8n" Mar 08 07:07:37 crc kubenswrapper[4717]: W0308 07:07:37.841952 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb73b6a1d_6f84_408a_a15d_a313242eccc7.slice/crio-419e53a12b4d472c5118aa7f990db7a266ed8dc05224b9e7bcb1c3faedad6507 WatchSource:0}: Error finding container 419e53a12b4d472c5118aa7f990db7a266ed8dc05224b9e7bcb1c3faedad6507: Status 404 returned error can't find the container with id 419e53a12b4d472c5118aa7f990db7a266ed8dc05224b9e7bcb1c3faedad6507 Mar 08 07:07:37 crc kubenswrapper[4717]: I0308 07:07:37.970703 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/crc-debug-b4p8n" event={"ID":"b73b6a1d-6f84-408a-a15d-a313242eccc7","Type":"ContainerStarted","Data":"419e53a12b4d472c5118aa7f990db7a266ed8dc05224b9e7bcb1c3faedad6507"} Mar 08 07:07:38 crc kubenswrapper[4717]: I0308 07:07:38.991653 4717 generic.go:334] "Generic (PLEG): container finished" podID="b73b6a1d-6f84-408a-a15d-a313242eccc7" containerID="ac447e159fae37ea5be7ea8d52886b7c7c0962da7ab177f77d87ec73682d2362" exitCode=0 Mar 08 07:07:38 crc kubenswrapper[4717]: I0308 07:07:38.991701 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/crc-debug-b4p8n" event={"ID":"b73b6a1d-6f84-408a-a15d-a313242eccc7","Type":"ContainerDied","Data":"ac447e159fae37ea5be7ea8d52886b7c7c0962da7ab177f77d87ec73682d2362"} Mar 08 07:07:39 crc kubenswrapper[4717]: I0308 07:07:39.034759 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k72vp/crc-debug-b4p8n"] Mar 08 07:07:39 crc kubenswrapper[4717]: I0308 07:07:39.043547 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-k72vp/crc-debug-b4p8n"] Mar 08 07:07:40 crc kubenswrapper[4717]: I0308 07:07:40.111855 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-b4p8n" Mar 08 07:07:40 crc kubenswrapper[4717]: I0308 07:07:40.216231 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b73b6a1d-6f84-408a-a15d-a313242eccc7-host\") pod \"b73b6a1d-6f84-408a-a15d-a313242eccc7\" (UID: \"b73b6a1d-6f84-408a-a15d-a313242eccc7\") " Mar 08 07:07:40 crc kubenswrapper[4717]: I0308 07:07:40.216377 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b73b6a1d-6f84-408a-a15d-a313242eccc7-host" (OuterVolumeSpecName: "host") pod "b73b6a1d-6f84-408a-a15d-a313242eccc7" (UID: "b73b6a1d-6f84-408a-a15d-a313242eccc7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 07:07:40 crc kubenswrapper[4717]: I0308 07:07:40.216763 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z74mh\" (UniqueName: \"kubernetes.io/projected/b73b6a1d-6f84-408a-a15d-a313242eccc7-kube-api-access-z74mh\") pod \"b73b6a1d-6f84-408a-a15d-a313242eccc7\" (UID: \"b73b6a1d-6f84-408a-a15d-a313242eccc7\") " Mar 08 07:07:40 crc kubenswrapper[4717]: I0308 07:07:40.217364 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b73b6a1d-6f84-408a-a15d-a313242eccc7-host\") on node \"crc\" DevicePath \"\"" Mar 08 07:07:40 crc kubenswrapper[4717]: I0308 07:07:40.222797 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73b6a1d-6f84-408a-a15d-a313242eccc7-kube-api-access-z74mh" (OuterVolumeSpecName: "kube-api-access-z74mh") pod "b73b6a1d-6f84-408a-a15d-a313242eccc7" (UID: "b73b6a1d-6f84-408a-a15d-a313242eccc7"). InnerVolumeSpecName "kube-api-access-z74mh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:07:40 crc kubenswrapper[4717]: I0308 07:07:40.319161 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z74mh\" (UniqueName: \"kubernetes.io/projected/b73b6a1d-6f84-408a-a15d-a313242eccc7-kube-api-access-z74mh\") on node \"crc\" DevicePath \"\"" Mar 08 07:07:41 crc kubenswrapper[4717]: I0308 07:07:41.015223 4717 scope.go:117] "RemoveContainer" containerID="ac447e159fae37ea5be7ea8d52886b7c7c0962da7ab177f77d87ec73682d2362" Mar 08 07:07:41 crc kubenswrapper[4717]: I0308 07:07:41.015311 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k72vp/crc-debug-b4p8n" Mar 08 07:07:41 crc kubenswrapper[4717]: I0308 07:07:41.793125 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b73b6a1d-6f84-408a-a15d-a313242eccc7" path="/var/lib/kubelet/pods/b73b6a1d-6f84-408a-a15d-a313242eccc7/volumes" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.139303 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549228-7mmbs"] Mar 08 07:08:00 crc kubenswrapper[4717]: E0308 07:08:00.140411 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73b6a1d-6f84-408a-a15d-a313242eccc7" containerName="container-00" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.140432 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73b6a1d-6f84-408a-a15d-a313242eccc7" containerName="container-00" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.140726 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73b6a1d-6f84-408a-a15d-a313242eccc7" containerName="container-00" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.141563 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549228-7mmbs" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.143300 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.144482 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.145815 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.149176 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549228-7mmbs"] Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.255221 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sddm\" (UniqueName: \"kubernetes.io/projected/10188bd2-f97d-4a4c-8695-1920886babf9-kube-api-access-5sddm\") pod \"auto-csr-approver-29549228-7mmbs\" (UID: \"10188bd2-f97d-4a4c-8695-1920886babf9\") " pod="openshift-infra/auto-csr-approver-29549228-7mmbs" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.357417 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sddm\" (UniqueName: \"kubernetes.io/projected/10188bd2-f97d-4a4c-8695-1920886babf9-kube-api-access-5sddm\") pod \"auto-csr-approver-29549228-7mmbs\" (UID: \"10188bd2-f97d-4a4c-8695-1920886babf9\") " pod="openshift-infra/auto-csr-approver-29549228-7mmbs" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.376428 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sddm\" (UniqueName: \"kubernetes.io/projected/10188bd2-f97d-4a4c-8695-1920886babf9-kube-api-access-5sddm\") pod \"auto-csr-approver-29549228-7mmbs\" (UID: \"10188bd2-f97d-4a4c-8695-1920886babf9\") " 
pod="openshift-infra/auto-csr-approver-29549228-7mmbs" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.476413 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549228-7mmbs" Mar 08 07:08:00 crc kubenswrapper[4717]: I0308 07:08:00.945949 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549228-7mmbs"] Mar 08 07:08:01 crc kubenswrapper[4717]: I0308 07:08:01.228057 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549228-7mmbs" event={"ID":"10188bd2-f97d-4a4c-8695-1920886babf9","Type":"ContainerStarted","Data":"03cd092326049920ddbe84bfb6d9585f959259a32b207c9b6acc8956535a8938"} Mar 08 07:08:02 crc kubenswrapper[4717]: I0308 07:08:02.243266 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549228-7mmbs" event={"ID":"10188bd2-f97d-4a4c-8695-1920886babf9","Type":"ContainerStarted","Data":"eb069b774e4977079c1beac49e9fa35a5ade38265da1aafc261c2802375382b2"} Mar 08 07:08:02 crc kubenswrapper[4717]: I0308 07:08:02.265157 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549228-7mmbs" podStartSLOduration=1.400281868 podStartE2EDuration="2.265132601s" podCreationTimestamp="2026-03-08 07:08:00 +0000 UTC" firstStartedPulling="2026-03-08 07:08:00.963634998 +0000 UTC m=+6107.881283842" lastFinishedPulling="2026-03-08 07:08:01.828485731 +0000 UTC m=+6108.746134575" observedRunningTime="2026-03-08 07:08:02.255880754 +0000 UTC m=+6109.173529598" watchObservedRunningTime="2026-03-08 07:08:02.265132601 +0000 UTC m=+6109.182781485" Mar 08 07:08:03 crc kubenswrapper[4717]: I0308 07:08:03.254563 4717 generic.go:334] "Generic (PLEG): container finished" podID="10188bd2-f97d-4a4c-8695-1920886babf9" containerID="eb069b774e4977079c1beac49e9fa35a5ade38265da1aafc261c2802375382b2" exitCode=0 Mar 08 07:08:03 crc 
kubenswrapper[4717]: I0308 07:08:03.254808 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549228-7mmbs" event={"ID":"10188bd2-f97d-4a4c-8695-1920886babf9","Type":"ContainerDied","Data":"eb069b774e4977079c1beac49e9fa35a5ade38265da1aafc261c2802375382b2"} Mar 08 07:08:04 crc kubenswrapper[4717]: I0308 07:08:04.693660 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549228-7mmbs" Mar 08 07:08:04 crc kubenswrapper[4717]: I0308 07:08:04.838966 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sddm\" (UniqueName: \"kubernetes.io/projected/10188bd2-f97d-4a4c-8695-1920886babf9-kube-api-access-5sddm\") pod \"10188bd2-f97d-4a4c-8695-1920886babf9\" (UID: \"10188bd2-f97d-4a4c-8695-1920886babf9\") " Mar 08 07:08:04 crc kubenswrapper[4717]: I0308 07:08:04.847010 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10188bd2-f97d-4a4c-8695-1920886babf9-kube-api-access-5sddm" (OuterVolumeSpecName: "kube-api-access-5sddm") pod "10188bd2-f97d-4a4c-8695-1920886babf9" (UID: "10188bd2-f97d-4a4c-8695-1920886babf9"). InnerVolumeSpecName "kube-api-access-5sddm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:08:04 crc kubenswrapper[4717]: I0308 07:08:04.941848 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sddm\" (UniqueName: \"kubernetes.io/projected/10188bd2-f97d-4a4c-8695-1920886babf9-kube-api-access-5sddm\") on node \"crc\" DevicePath \"\"" Mar 08 07:08:05 crc kubenswrapper[4717]: I0308 07:08:05.289975 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549228-7mmbs" event={"ID":"10188bd2-f97d-4a4c-8695-1920886babf9","Type":"ContainerDied","Data":"03cd092326049920ddbe84bfb6d9585f959259a32b207c9b6acc8956535a8938"} Mar 08 07:08:05 crc kubenswrapper[4717]: I0308 07:08:05.290035 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03cd092326049920ddbe84bfb6d9585f959259a32b207c9b6acc8956535a8938" Mar 08 07:08:05 crc kubenswrapper[4717]: I0308 07:08:05.290111 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549228-7mmbs" Mar 08 07:08:05 crc kubenswrapper[4717]: I0308 07:08:05.334119 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549222-h24gp"] Mar 08 07:08:05 crc kubenswrapper[4717]: I0308 07:08:05.351288 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549222-h24gp"] Mar 08 07:08:05 crc kubenswrapper[4717]: I0308 07:08:05.796612 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9714ba72-0ac4-47ac-9afc-4f62b43ccf38" path="/var/lib/kubelet/pods/9714ba72-0ac4-47ac-9afc-4f62b43ccf38/volumes" Mar 08 07:08:21 crc kubenswrapper[4717]: I0308 07:08:21.520008 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-569c95cff8-l9lj5_24f496e4-7d01-447c-9ebb-9da7b333d817/barbican-api/0.log" Mar 08 07:08:21 crc kubenswrapper[4717]: I0308 07:08:21.681002 4717 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_barbican-api-569c95cff8-l9lj5_24f496e4-7d01-447c-9ebb-9da7b333d817/barbican-api-log/0.log" Mar 08 07:08:21 crc kubenswrapper[4717]: I0308 07:08:21.772824 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79d58bf7f8-bq7ms_47805c28-c90d-4882-a0ed-5e531fb545b4/barbican-keystone-listener/0.log" Mar 08 07:08:21 crc kubenswrapper[4717]: I0308 07:08:21.851426 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-79d58bf7f8-bq7ms_47805c28-c90d-4882-a0ed-5e531fb545b4/barbican-keystone-listener-log/0.log" Mar 08 07:08:21 crc kubenswrapper[4717]: I0308 07:08:21.964991 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f775c4c7-brpx4_d7bf9dbc-ec82-4659-92fe-509f95574ef3/barbican-worker/0.log" Mar 08 07:08:21 crc kubenswrapper[4717]: I0308 07:08:21.978707 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f775c4c7-brpx4_d7bf9dbc-ec82-4659-92fe-509f95574ef3/barbican-worker-log/0.log" Mar 08 07:08:22 crc kubenswrapper[4717]: I0308 07:08:22.172355 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jr6fz_d8682143-56c7-442e-987a-d9da77fbe879/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:22 crc kubenswrapper[4717]: I0308 07:08:22.304994 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_34d60e99-8898-4576-b35a-8323db25511c/ceilometer-central-agent/0.log" Mar 08 07:08:22 crc kubenswrapper[4717]: I0308 07:08:22.325565 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_34d60e99-8898-4576-b35a-8323db25511c/ceilometer-notification-agent/0.log" Mar 08 07:08:22 crc kubenswrapper[4717]: I0308 07:08:22.401987 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_34d60e99-8898-4576-b35a-8323db25511c/proxy-httpd/0.log" Mar 08 07:08:22 crc kubenswrapper[4717]: I0308 07:08:22.422942 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_34d60e99-8898-4576-b35a-8323db25511c/sg-core/0.log" Mar 08 07:08:22 crc kubenswrapper[4717]: I0308 07:08:22.602321 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5e36763c-a3c1-424c-8982-1af635ee7100/cinder-api-log/0.log" Mar 08 07:08:22 crc kubenswrapper[4717]: I0308 07:08:22.873468 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5e36763c-a3c1-424c-8982-1af635ee7100/cinder-api/0.log" Mar 08 07:08:22 crc kubenswrapper[4717]: I0308 07:08:22.890067 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8fc8ad72-3a80-4520-8387-11aeb8bca94f/cinder-scheduler/0.log" Mar 08 07:08:22 crc kubenswrapper[4717]: I0308 07:08:22.909589 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8fc8ad72-3a80-4520-8387-11aeb8bca94f/probe/0.log" Mar 08 07:08:23 crc kubenswrapper[4717]: I0308 07:08:23.062949 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-5dfzm_fbffff61-9614-4594-b52e-be489d2b2f22/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:23 crc kubenswrapper[4717]: I0308 07:08:23.098769 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-xd4v9_d9973d9b-7167-4a1b-9115-a7fb9a2921c3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:23 crc kubenswrapper[4717]: I0308 07:08:23.247442 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-9f5648895-t45xw_b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9/init/0.log" Mar 08 07:08:23 crc kubenswrapper[4717]: I0308 07:08:23.413488 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-9f5648895-t45xw_b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9/init/0.log" Mar 08 07:08:23 crc kubenswrapper[4717]: I0308 07:08:23.441166 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-zmr2c_099dec1f-b123-4da0-a81f-52ee1b27d5df/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:23 crc kubenswrapper[4717]: I0308 07:08:23.563922 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-9f5648895-t45xw_b5a2e6e1-5cab-4e70-b0a2-0d7d6eccc1f9/dnsmasq-dns/0.log" Mar 08 07:08:23 crc kubenswrapper[4717]: I0308 07:08:23.620014 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2ffc0380-502c-48b0-b36a-8421c5503fde/glance-httpd/0.log" Mar 08 07:08:23 crc kubenswrapper[4717]: I0308 07:08:23.685778 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2ffc0380-502c-48b0-b36a-8421c5503fde/glance-log/0.log" Mar 08 07:08:23 crc kubenswrapper[4717]: I0308 07:08:23.849524 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c715384f-21a1-490a-9432-1fef4658f5bd/glance-httpd/0.log" Mar 08 07:08:23 crc kubenswrapper[4717]: I0308 07:08:23.856206 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c715384f-21a1-490a-9432-1fef4658f5bd/glance-log/0.log" Mar 08 07:08:24 crc kubenswrapper[4717]: I0308 07:08:24.100966 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-69bcb664dd-nb94m_9ab815c4-1b4d-499a-af69-f5e5907c9542/horizon/0.log" Mar 08 07:08:24 crc kubenswrapper[4717]: I0308 07:08:24.264086 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-c4mbw_534637bd-8579-46f3-bee7-d6270aa8130c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:24 crc kubenswrapper[4717]: I0308 07:08:24.428310 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5vnmd_ef634a99-41c2-496a-b06e-d697710e676d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:24 crc kubenswrapper[4717]: I0308 07:08:24.628491 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-69bcb664dd-nb94m_9ab815c4-1b4d-499a-af69-f5e5907c9542/horizon-log/0.log" Mar 08 07:08:24 crc kubenswrapper[4717]: I0308 07:08:24.704815 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29549161-72gqq_8a320a32-4b25-423c-9e3c-5ca2d08652c5/keystone-cron/0.log" Mar 08 07:08:24 crc kubenswrapper[4717]: I0308 07:08:24.944294 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5799fc9f64-fmph6_4270902e-1721-4286-be1f-baadb9dc68c1/keystone-api/0.log" Mar 08 07:08:24 crc kubenswrapper[4717]: I0308 07:08:24.961719 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29549221-mwzz7_3c258b2f-d522-4be9-a894-01fdd5289ebe/keystone-cron/0.log" Mar 08 07:08:25 crc kubenswrapper[4717]: I0308 07:08:25.094923 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_99fa4aa1-7000-4df4-8c35-d9bf87df65f3/kube-state-metrics/0.log" Mar 08 07:08:25 crc kubenswrapper[4717]: I0308 07:08:25.163971 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-tbjvm_a99f6dd1-e80a-4191-b85a-31042a1d9fc0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:25 crc kubenswrapper[4717]: I0308 07:08:25.654318 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ldxq8_ced2f113-4928-44e4-a34a-3ff2a669dec6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:25 crc kubenswrapper[4717]: I0308 07:08:25.667019 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-665f758875-jsp86_7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494/neutron-httpd/0.log" Mar 08 07:08:25 crc kubenswrapper[4717]: I0308 07:08:25.690579 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-665f758875-jsp86_7d3c60ea-d0ec-4d03-a5d5-6f5db5fb1494/neutron-api/0.log" Mar 08 07:08:25 crc kubenswrapper[4717]: I0308 07:08:25.828962 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec/setup-container/0.log" Mar 08 07:08:26 crc kubenswrapper[4717]: I0308 07:08:26.025770 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec/rabbitmq/0.log" Mar 08 07:08:26 crc kubenswrapper[4717]: I0308 07:08:26.056969 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_f4f381c5-5d04-4cb1-9923-5d1d5edaf3ec/setup-container/0.log" Mar 08 07:08:26 crc kubenswrapper[4717]: I0308 07:08:26.619385 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_398b8cef-0b8b-4e8f-80c2-2afa74fa75be/nova-cell0-conductor-conductor/0.log" Mar 08 07:08:27 crc kubenswrapper[4717]: I0308 07:08:27.099480 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_65a7fb6f-308f-468e-8e2c-7adc53b2eb15/nova-cell1-conductor-conductor/0.log" Mar 08 07:08:27 crc kubenswrapper[4717]: I0308 07:08:27.301646 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bf251d2f-577d-4de2-ac4b-f51dc79add8d/nova-cell1-novncproxy-novncproxy/0.log" Mar 08 07:08:27 crc kubenswrapper[4717]: I0308 07:08:27.520332 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2a5123f2-ce36-401f-90d0-885684623a99/nova-api-log/0.log" Mar 08 07:08:27 crc kubenswrapper[4717]: I0308 07:08:27.803175 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-rct9m_adf01f26-1066-4901-aa10-cd145a720cd6/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:27 crc kubenswrapper[4717]: I0308 07:08:27.867150 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5315a88e-0b28-4ad7-bc83-278711f6fb29/nova-metadata-log/0.log" Mar 08 07:08:27 crc kubenswrapper[4717]: I0308 07:08:27.991741 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2a5123f2-ce36-401f-90d0-885684623a99/nova-api-api/0.log" Mar 08 07:08:28 crc kubenswrapper[4717]: I0308 07:08:28.342138 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e4e6ff9-db68-44fc-a8d2-de9471a74f19/mysql-bootstrap/0.log" Mar 08 07:08:28 crc kubenswrapper[4717]: I0308 07:08:28.442680 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_fea1dcce-25a3-4f13-960a-5b08bf49e521/nova-scheduler-scheduler/0.log" Mar 08 07:08:28 crc kubenswrapper[4717]: I0308 07:08:28.584212 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e4e6ff9-db68-44fc-a8d2-de9471a74f19/mysql-bootstrap/0.log" Mar 08 07:08:28 crc kubenswrapper[4717]: I0308 07:08:28.620283 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9e4e6ff9-db68-44fc-a8d2-de9471a74f19/galera/0.log" Mar 08 07:08:28 crc kubenswrapper[4717]: I0308 07:08:28.856134 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-galera-0_739f45be-d031-4f80-9c39-1683ddff1289/mysql-bootstrap/0.log" Mar 08 07:08:29 crc kubenswrapper[4717]: I0308 07:08:29.031612 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_739f45be-d031-4f80-9c39-1683ddff1289/galera/0.log" Mar 08 07:08:29 crc kubenswrapper[4717]: I0308 07:08:29.069036 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_739f45be-d031-4f80-9c39-1683ddff1289/mysql-bootstrap/0.log" Mar 08 07:08:29 crc kubenswrapper[4717]: I0308 07:08:29.295619 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9a66e3e0-63d9-4ca4-ab60-8a842f37cc68/openstackclient/0.log" Mar 08 07:08:29 crc kubenswrapper[4717]: I0308 07:08:29.360334 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mcfsn_52c7f6df-8563-4181-bc1e-6fb4c3dd2126/ovn-controller/0.log" Mar 08 07:08:29 crc kubenswrapper[4717]: I0308 07:08:29.541973 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jfvhf_047dda74-4541-43e2-bc0f-ebdd951d1dbf/openstack-network-exporter/0.log" Mar 08 07:08:29 crc kubenswrapper[4717]: I0308 07:08:29.722972 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4c5fb_0bed90a3-1840-4f1a-a71b-cad45398bd15/ovsdb-server-init/0.log" Mar 08 07:08:29 crc kubenswrapper[4717]: I0308 07:08:29.934883 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4c5fb_0bed90a3-1840-4f1a-a71b-cad45398bd15/ovsdb-server/0.log" Mar 08 07:08:29 crc kubenswrapper[4717]: I0308 07:08:29.947631 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4c5fb_0bed90a3-1840-4f1a-a71b-cad45398bd15/ovsdb-server-init/0.log" Mar 08 07:08:30 crc kubenswrapper[4717]: I0308 07:08:30.331887 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-4c5fb_0bed90a3-1840-4f1a-a71b-cad45398bd15/ovs-vswitchd/0.log" Mar 08 07:08:30 crc kubenswrapper[4717]: I0308 07:08:30.367419 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5315a88e-0b28-4ad7-bc83-278711f6fb29/nova-metadata-metadata/0.log" Mar 08 07:08:30 crc kubenswrapper[4717]: I0308 07:08:30.381597 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-2c94z_fed1792b-78eb-43bf-9e33-276a5b4477f7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:30 crc kubenswrapper[4717]: I0308 07:08:30.515043 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bde07dc5-6141-42e2-b280-d4df5ebe3d61/openstack-network-exporter/0.log" Mar 08 07:08:30 crc kubenswrapper[4717]: I0308 07:08:30.617774 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bde07dc5-6141-42e2-b280-d4df5ebe3d61/ovn-northd/0.log" Mar 08 07:08:30 crc kubenswrapper[4717]: I0308 07:08:30.752875 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_353c8ab9-3710-4290-b5c4-b93339baf4da/openstack-network-exporter/0.log" Mar 08 07:08:30 crc kubenswrapper[4717]: I0308 07:08:30.799594 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_353c8ab9-3710-4290-b5c4-b93339baf4da/ovsdbserver-nb/0.log" Mar 08 07:08:30 crc kubenswrapper[4717]: I0308 07:08:30.899467 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0427b7fd-2766-4b7f-bb23-96df1b2f4f5c/openstack-network-exporter/0.log" Mar 08 07:08:31 crc kubenswrapper[4717]: I0308 07:08:31.026511 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0427b7fd-2766-4b7f-bb23-96df1b2f4f5c/ovsdbserver-sb/0.log" Mar 08 07:08:31 crc kubenswrapper[4717]: I0308 07:08:31.198847 4717 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-6c49bc6878-t8tg8_bb4d403c-6eb6-401c-9b4b-734c6adf3828/placement-api/0.log" Mar 08 07:08:31 crc kubenswrapper[4717]: I0308 07:08:31.305099 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f60230bf-f6a0-4a30-8d32-fd3ec01cf27a/init-config-reloader/0.log" Mar 08 07:08:31 crc kubenswrapper[4717]: I0308 07:08:31.315020 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c49bc6878-t8tg8_bb4d403c-6eb6-401c-9b4b-734c6adf3828/placement-log/0.log" Mar 08 07:08:31 crc kubenswrapper[4717]: I0308 07:08:31.544044 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f60230bf-f6a0-4a30-8d32-fd3ec01cf27a/thanos-sidecar/0.log" Mar 08 07:08:31 crc kubenswrapper[4717]: I0308 07:08:31.611724 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f60230bf-f6a0-4a30-8d32-fd3ec01cf27a/init-config-reloader/0.log" Mar 08 07:08:31 crc kubenswrapper[4717]: I0308 07:08:31.621403 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f60230bf-f6a0-4a30-8d32-fd3ec01cf27a/config-reloader/0.log" Mar 08 07:08:31 crc kubenswrapper[4717]: I0308 07:08:31.661470 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f60230bf-f6a0-4a30-8d32-fd3ec01cf27a/prometheus/0.log" Mar 08 07:08:31 crc kubenswrapper[4717]: I0308 07:08:31.809447 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7/setup-container/0.log" Mar 08 07:08:32 crc kubenswrapper[4717]: I0308 07:08:32.086640 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7/rabbitmq/0.log" Mar 08 07:08:32 crc kubenswrapper[4717]: I0308 07:08:32.098606 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c1331e99-d131-4f8c-ae4e-6217cf54ddaf/setup-container/0.log" Mar 08 07:08:32 crc kubenswrapper[4717]: I0308 07:08:32.112694 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4a7bf3f6-cc6f-4e57-9b1e-3b1b54c0a1f7/setup-container/0.log" Mar 08 07:08:32 crc kubenswrapper[4717]: I0308 07:08:32.366252 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c1331e99-d131-4f8c-ae4e-6217cf54ddaf/setup-container/0.log" Mar 08 07:08:32 crc kubenswrapper[4717]: I0308 07:08:32.390358 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-zghmp_228cb615-a265-435c-bca0-5cb037e311b6/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:32 crc kubenswrapper[4717]: I0308 07:08:32.426865 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c1331e99-d131-4f8c-ae4e-6217cf54ddaf/rabbitmq/0.log" Mar 08 07:08:32 crc kubenswrapper[4717]: I0308 07:08:32.602642 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-kqcr2_9af81be1-1bd6-46d1-ab21-d61cd769fd21/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:32 crc kubenswrapper[4717]: I0308 07:08:32.688312 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-74h5h_9cff8432-fc11-45e4-9e59-9abfdc356b44/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:32 crc kubenswrapper[4717]: I0308 07:08:32.842589 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-hdxjp_e60e7e38-c8ac-4b48-bfc0-04e5b8e56874/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:32 crc kubenswrapper[4717]: I0308 07:08:32.959403 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dz2kg_d4a65517-456b-4bf4-9e4b-cea94baeb6a7/ssh-known-hosts-edpm-deployment/0.log" Mar 08 07:08:33 crc kubenswrapper[4717]: I0308 07:08:33.263645 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-cc47695ff-btlzb_050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6/proxy-server/0.log" Mar 08 07:08:33 crc kubenswrapper[4717]: I0308 07:08:33.305758 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-cc47695ff-btlzb_050a571c-d8dd-4a97-a3cf-4c5d5b00b7e6/proxy-httpd/0.log" Mar 08 07:08:33 crc kubenswrapper[4717]: I0308 07:08:33.336288 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-wc6z2_03d2941c-7434-4961-a7ea-fdff878a1128/swift-ring-rebalance/0.log" Mar 08 07:08:33 crc kubenswrapper[4717]: I0308 07:08:33.460314 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/account-auditor/0.log" Mar 08 07:08:33 crc kubenswrapper[4717]: I0308 07:08:33.727332 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/account-reaper/0.log" Mar 08 07:08:33 crc kubenswrapper[4717]: I0308 07:08:33.786140 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/account-replicator/0.log" Mar 08 07:08:33 crc kubenswrapper[4717]: I0308 07:08:33.868315 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/account-server/0.log" Mar 08 07:08:33 crc kubenswrapper[4717]: I0308 07:08:33.980022 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/container-auditor/0.log" Mar 08 07:08:33 crc kubenswrapper[4717]: I0308 07:08:33.990529 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/container-server/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.014611 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/container-replicator/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.157677 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/container-updater/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.207013 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/object-auditor/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.266962 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/object-expirer/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.313061 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/object-replicator/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.432880 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/object-server/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.501670 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/object-updater/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.517111 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/swift-recon-cron/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.529867 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_67a11de8-b5e8-40d8-a451-1bece45918d8/rsync/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.753732 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0e0647cd-807a-44fc-a1e0-f5ce609b835d/tempest-tests-tempest-tests-runner/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.849776 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-g9vrj_2dabb1b7-df9b-4b70-94dc-d9e29be0856f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:34 crc kubenswrapper[4717]: I0308 07:08:34.979328 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d536dc5d-11cf-4b1a-81bc-a17b532c9baa/test-operator-logs-container/0.log" Mar 08 07:08:35 crc kubenswrapper[4717]: I0308 07:08:35.099312 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kc72w_f1249858-ba3a-4c6e-af8a-b7784e9795a0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 08 07:08:36 crc kubenswrapper[4717]: I0308 07:08:36.091671 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_58162927-f626-43d8-a792-507cf584db78/watcher-applier/0.log" Mar 08 07:08:36 crc kubenswrapper[4717]: I0308 07:08:36.785966 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_acde9c29-0910-40cc-9da8-06a566c67b4c/watcher-api-log/0.log" Mar 08 07:08:39 crc kubenswrapper[4717]: I0308 07:08:39.056446 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_e1b3240f-d4a8-409e-a8bf-a2f2d03ac126/watcher-decision-engine/0.log" Mar 08 07:08:40 crc kubenswrapper[4717]: I0308 07:08:40.454456 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-api-0_acde9c29-0910-40cc-9da8-06a566c67b4c/watcher-api/0.log" Mar 08 07:08:42 crc kubenswrapper[4717]: I0308 07:08:42.664926 4717 scope.go:117] "RemoveContainer" containerID="f304960f7f7ca3418138cdf14afd5033741672e19ef5255cf46230a950af72c4" Mar 08 07:08:52 crc kubenswrapper[4717]: I0308 07:08:52.634241 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ee8a4411-d973-4eeb-b6cd-eb0844e7826e/memcached/0.log" Mar 08 07:09:06 crc kubenswrapper[4717]: I0308 07:09:06.336990 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/util/0.log" Mar 08 07:09:06 crc kubenswrapper[4717]: I0308 07:09:06.531457 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/pull/0.log" Mar 08 07:09:06 crc kubenswrapper[4717]: I0308 07:09:06.537545 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/util/0.log" Mar 08 07:09:06 crc kubenswrapper[4717]: I0308 07:09:06.554775 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/pull/0.log" Mar 08 07:09:06 crc kubenswrapper[4717]: I0308 07:09:06.724873 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/util/0.log" Mar 08 07:09:06 crc kubenswrapper[4717]: I0308 07:09:06.729185 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/extract/0.log" Mar 08 07:09:06 crc kubenswrapper[4717]: I0308 07:09:06.730537 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99tfwn8_c6e8615c-2151-4a0a-93e3-0638f91ab76c/pull/0.log" Mar 08 07:09:07 crc kubenswrapper[4717]: I0308 07:09:07.106294 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-wdwbm_bf98a4b8-6e3c-423d-b228-347c527e6721/manager/0.log" Mar 08 07:09:07 crc kubenswrapper[4717]: I0308 07:09:07.456780 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-vbdls_11010a39-3786-472f-ad04-805c35647afc/manager/0.log" Mar 08 07:09:07 crc kubenswrapper[4717]: I0308 07:09:07.653797 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-vpqhf_74fc8d21-150d-4009-b0ba-b6a47db5adbb/manager/0.log" Mar 08 07:09:07 crc kubenswrapper[4717]: I0308 07:09:07.911450 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-s8gsj_9fe70b75-885a-402b-98e1-f5c696e47f48/manager/0.log" Mar 08 07:09:08 crc kubenswrapper[4717]: I0308 07:09:08.306337 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-x44f7_9c64b2a6-6663-46cc-b762-bffa01baeb47/manager/0.log" Mar 08 07:09:08 crc kubenswrapper[4717]: I0308 07:09:08.822426 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-wmjsb_7668ece6-7b88-4707-baf2-62379071cf43/manager/0.log" Mar 08 07:09:08 crc kubenswrapper[4717]: I0308 07:09:08.849106 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-vrnlx_be14026d-4e86-4134-8f2a-617e9272d2a1/manager/0.log" Mar 08 07:09:09 crc kubenswrapper[4717]: I0308 07:09:09.031525 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-mp5dj_7eda0f52-4fcf-46fe-b329-075fb4d79c74/manager/0.log" Mar 08 07:09:09 crc kubenswrapper[4717]: I0308 07:09:09.333401 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-959nd_148c1a2c-7098-4111-a12e-02e2dcc295a6/manager/0.log" Mar 08 07:09:09 crc kubenswrapper[4717]: I0308 07:09:09.538100 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-x5kbt_8f3bb097-82e6-4fe8-ad89-48004c80477b/manager/0.log" Mar 08 07:09:09 crc kubenswrapper[4717]: I0308 07:09:09.646233 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-hl7k4_999e5f1a-4be7-4716-8999-e28027c618b9/manager/0.log" Mar 08 07:09:09 crc kubenswrapper[4717]: I0308 07:09:09.898625 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-cmn95_44e5de82-d168-400e-801f-1f122a08c656/manager/0.log" Mar 08 07:09:09 crc kubenswrapper[4717]: I0308 07:09:09.927426 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-htv95_17485954-f1e6-4042-9338-ad5115801764/manager/0.log" Mar 08 07:09:10 crc kubenswrapper[4717]: I0308 07:09:10.088193 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-dc6dbbbd-87x6n_f09b3f70-1158-4269-abf3-acf3fecc0cb9/manager/0.log" Mar 08 07:09:10 crc kubenswrapper[4717]: I0308 
07:09:10.416386 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6f44f7b99f-4xgmb_003c1f39-7ea2-4391-87f9-875cbdf6e1cc/operator/0.log" Mar 08 07:09:10 crc kubenswrapper[4717]: I0308 07:09:10.795265 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-bdxpf_33dbebe0-8cce-49d5-afc5-287c2c188438/registry-server/0.log" Mar 08 07:09:10 crc kubenswrapper[4717]: I0308 07:09:10.943817 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-dfmch_3cd0ad0a-7a9e-4870-8b76-58f975cd36e4/manager/0.log" Mar 08 07:09:11 crc kubenswrapper[4717]: I0308 07:09:11.059847 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-bv6fm_ae7df2ae-3ad9-4c73-a957-fe35b87703ec/manager/0.log" Mar 08 07:09:11 crc kubenswrapper[4717]: I0308 07:09:11.172509 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4bqgl_7da8d6da-69ae-4351-a774-20888648eac2/operator/0.log" Mar 08 07:09:11 crc kubenswrapper[4717]: I0308 07:09:11.480652 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-djsmm_d7c1a0d3-1242-402f-88a3-6d45d4c6661a/manager/0.log" Mar 08 07:09:11 crc kubenswrapper[4717]: I0308 07:09:11.690326 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-cvjxp_4a173035-b1d9-4435-a2d1-b29e9bea39be/manager/0.log" Mar 08 07:09:11 crc kubenswrapper[4717]: I0308 07:09:11.927762 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-6hf5q_3689217b-f2db-4d81-8e68-7f728ce20860/manager/0.log" Mar 08 07:09:12 crc kubenswrapper[4717]: I0308 
07:09:12.029237 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-hbqjn_56fb95c9-5d2d-4c31-b5ca-d97f91ef8ca5/manager/0.log" Mar 08 07:09:12 crc kubenswrapper[4717]: I0308 07:09:12.216445 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dfcb4d64f-8wwfb_1fdfe5d1-1d8b-4016-843f-6ba5703c9f6b/manager/0.log" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.781820 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mdzd4"] Mar 08 07:09:14 crc kubenswrapper[4717]: E0308 07:09:14.782964 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10188bd2-f97d-4a4c-8695-1920886babf9" containerName="oc" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.782979 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="10188bd2-f97d-4a4c-8695-1920886babf9" containerName="oc" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.783160 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="10188bd2-f97d-4a4c-8695-1920886babf9" containerName="oc" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.784922 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.798574 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdzd4"] Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.873938 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-utilities\") pod \"certified-operators-mdzd4\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.874205 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxjt8\" (UniqueName: \"kubernetes.io/projected/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-kube-api-access-vxjt8\") pod \"certified-operators-mdzd4\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.874258 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-catalog-content\") pod \"certified-operators-mdzd4\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.977808 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxjt8\" (UniqueName: \"kubernetes.io/projected/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-kube-api-access-vxjt8\") pod \"certified-operators-mdzd4\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.977853 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-catalog-content\") pod \"certified-operators-mdzd4\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.977893 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-utilities\") pod \"certified-operators-mdzd4\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.978406 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-utilities\") pod \"certified-operators-mdzd4\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.978633 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-catalog-content\") pod \"certified-operators-mdzd4\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:14 crc kubenswrapper[4717]: I0308 07:09:14.996607 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxjt8\" (UniqueName: \"kubernetes.io/projected/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-kube-api-access-vxjt8\") pod \"certified-operators-mdzd4\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:15 crc kubenswrapper[4717]: I0308 07:09:15.116505 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:15 crc kubenswrapper[4717]: I0308 07:09:15.671666 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdzd4"] Mar 08 07:09:15 crc kubenswrapper[4717]: I0308 07:09:15.996715 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" containerID="34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57" exitCode=0 Mar 08 07:09:15 crc kubenswrapper[4717]: I0308 07:09:15.996766 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdzd4" event={"ID":"d8ef1b4b-5400-4c4c-8efe-58a442949c8e","Type":"ContainerDied","Data":"34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57"} Mar 08 07:09:15 crc kubenswrapper[4717]: I0308 07:09:15.996799 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdzd4" event={"ID":"d8ef1b4b-5400-4c4c-8efe-58a442949c8e","Type":"ContainerStarted","Data":"8f1a9e22216a3cc2f2198aee428589d2be972df9ac7fb53f647c80932120ed4f"} Mar 08 07:09:15 crc kubenswrapper[4717]: I0308 07:09:15.998955 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 07:09:17 crc kubenswrapper[4717]: I0308 07:09:17.008484 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdzd4" event={"ID":"d8ef1b4b-5400-4c4c-8efe-58a442949c8e","Type":"ContainerStarted","Data":"00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e"} Mar 08 07:09:18 crc kubenswrapper[4717]: I0308 07:09:18.461022 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-7x79h_8e10706f-2cf2-4b11-a084-33df5b7fe0a1/manager/0.log" Mar 08 07:09:19 crc kubenswrapper[4717]: I0308 07:09:19.035441 4717 generic.go:334] "Generic (PLEG): 
container finished" podID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" containerID="00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e" exitCode=0 Mar 08 07:09:19 crc kubenswrapper[4717]: I0308 07:09:19.035476 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdzd4" event={"ID":"d8ef1b4b-5400-4c4c-8efe-58a442949c8e","Type":"ContainerDied","Data":"00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e"} Mar 08 07:09:20 crc kubenswrapper[4717]: I0308 07:09:20.046589 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdzd4" event={"ID":"d8ef1b4b-5400-4c4c-8efe-58a442949c8e","Type":"ContainerStarted","Data":"d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e"} Mar 08 07:09:20 crc kubenswrapper[4717]: I0308 07:09:20.069726 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mdzd4" podStartSLOduration=2.618443373 podStartE2EDuration="6.069706585s" podCreationTimestamp="2026-03-08 07:09:14 +0000 UTC" firstStartedPulling="2026-03-08 07:09:15.998667439 +0000 UTC m=+6182.916316283" lastFinishedPulling="2026-03-08 07:09:19.449930651 +0000 UTC m=+6186.367579495" observedRunningTime="2026-03-08 07:09:20.063434671 +0000 UTC m=+6186.981083525" watchObservedRunningTime="2026-03-08 07:09:20.069706585 +0000 UTC m=+6186.987355429" Mar 08 07:09:25 crc kubenswrapper[4717]: I0308 07:09:25.117171 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:25 crc kubenswrapper[4717]: I0308 07:09:25.117832 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:25 crc kubenswrapper[4717]: I0308 07:09:25.162712 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:26 crc kubenswrapper[4717]: I0308 07:09:26.177795 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:26 crc kubenswrapper[4717]: I0308 07:09:26.240649 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mdzd4"] Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.135254 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mdzd4" podUID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" containerName="registry-server" containerID="cri-o://d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e" gracePeriod=2 Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.660195 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.774005 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxjt8\" (UniqueName: \"kubernetes.io/projected/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-kube-api-access-vxjt8\") pod \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.774071 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-utilities\") pod \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.774132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-catalog-content\") pod 
\"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\" (UID: \"d8ef1b4b-5400-4c4c-8efe-58a442949c8e\") " Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.775272 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-utilities" (OuterVolumeSpecName: "utilities") pod "d8ef1b4b-5400-4c4c-8efe-58a442949c8e" (UID: "d8ef1b4b-5400-4c4c-8efe-58a442949c8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.779779 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-kube-api-access-vxjt8" (OuterVolumeSpecName: "kube-api-access-vxjt8") pod "d8ef1b4b-5400-4c4c-8efe-58a442949c8e" (UID: "d8ef1b4b-5400-4c4c-8efe-58a442949c8e"). InnerVolumeSpecName "kube-api-access-vxjt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.828259 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8ef1b4b-5400-4c4c-8efe-58a442949c8e" (UID: "d8ef1b4b-5400-4c4c-8efe-58a442949c8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.876989 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxjt8\" (UniqueName: \"kubernetes.io/projected/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-kube-api-access-vxjt8\") on node \"crc\" DevicePath \"\"" Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.877033 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 07:09:28 crc kubenswrapper[4717]: I0308 07:09:28.877041 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ef1b4b-5400-4c4c-8efe-58a442949c8e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.147981 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" containerID="d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e" exitCode=0 Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.148808 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdzd4" event={"ID":"d8ef1b4b-5400-4c4c-8efe-58a442949c8e","Type":"ContainerDied","Data":"d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e"} Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.148851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdzd4" event={"ID":"d8ef1b4b-5400-4c4c-8efe-58a442949c8e","Type":"ContainerDied","Data":"8f1a9e22216a3cc2f2198aee428589d2be972df9ac7fb53f647c80932120ed4f"} Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.148876 4717 scope.go:117] "RemoveContainer" containerID="d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e" Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 
07:09:29.149018 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdzd4" Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.171782 4717 scope.go:117] "RemoveContainer" containerID="00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e" Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.189726 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mdzd4"] Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.198346 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mdzd4"] Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.218326 4717 scope.go:117] "RemoveContainer" containerID="34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57" Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.278883 4717 scope.go:117] "RemoveContainer" containerID="d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e" Mar 08 07:09:29 crc kubenswrapper[4717]: E0308 07:09:29.279337 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e\": container with ID starting with d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e not found: ID does not exist" containerID="d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e" Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.279367 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e"} err="failed to get container status \"d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e\": rpc error: code = NotFound desc = could not find container \"d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e\": container with ID starting with 
d9377a4cbcb090f6a27e9162492afb0570e72e6404934e08c40883b06ce3b67e not found: ID does not exist" Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.279390 4717 scope.go:117] "RemoveContainer" containerID="00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e" Mar 08 07:09:29 crc kubenswrapper[4717]: E0308 07:09:29.279642 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e\": container with ID starting with 00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e not found: ID does not exist" containerID="00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e" Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.279701 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e"} err="failed to get container status \"00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e\": rpc error: code = NotFound desc = could not find container \"00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e\": container with ID starting with 00a5dc3876513f1ed957894026c51e06389f97561931fc46e11acec02ff4c63e not found: ID does not exist" Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.279730 4717 scope.go:117] "RemoveContainer" containerID="34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57" Mar 08 07:09:29 crc kubenswrapper[4717]: E0308 07:09:29.279953 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57\": container with ID starting with 34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57 not found: ID does not exist" containerID="34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57" Mar 08 07:09:29 crc 
kubenswrapper[4717]: I0308 07:09:29.279970 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57"} err="failed to get container status \"34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57\": rpc error: code = NotFound desc = could not find container \"34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57\": container with ID starting with 34b2373f07640ad3361501ca0357e74edfbb3a1f9a0e7fffc8d5655c09776c57 not found: ID does not exist" Mar 08 07:09:29 crc kubenswrapper[4717]: E0308 07:09:29.286764 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8ef1b4b_5400_4c4c_8efe_58a442949c8e.slice/crio-8f1a9e22216a3cc2f2198aee428589d2be972df9ac7fb53f647c80932120ed4f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8ef1b4b_5400_4c4c_8efe_58a442949c8e.slice\": RecentStats: unable to find data in memory cache]" Mar 08 07:09:29 crc kubenswrapper[4717]: I0308 07:09:29.792140 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" path="/var/lib/kubelet/pods/d8ef1b4b-5400-4c4c-8efe-58a442949c8e/volumes" Mar 08 07:09:33 crc kubenswrapper[4717]: I0308 07:09:33.382343 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hhrcp_0507da4e-a2d5-43c2-b5e2-25f42085431c/control-plane-machine-set-operator/0.log" Mar 08 07:09:33 crc kubenswrapper[4717]: I0308 07:09:33.556541 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ppg2t_4ca60946-75d5-469e-84f0-d200ca8c0cfd/kube-rbac-proxy/0.log" Mar 08 07:09:33 crc kubenswrapper[4717]: I0308 07:09:33.580246 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ppg2t_4ca60946-75d5-469e-84f0-d200ca8c0cfd/machine-api-operator/0.log" Mar 08 07:09:34 crc kubenswrapper[4717]: I0308 07:09:34.120490 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 07:09:34 crc kubenswrapper[4717]: I0308 07:09:34.120550 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 07:09:47 crc kubenswrapper[4717]: I0308 07:09:47.814414 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-q9jgr_af96d97e-e051-406c-b0f5-c9d59fb60bfa/cert-manager-controller/0.log" Mar 08 07:09:47 crc kubenswrapper[4717]: I0308 07:09:47.973603 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-w5ms2_1e9abf00-821a-412c-b6da-fa5c1f1a568a/cert-manager-cainjector/0.log" Mar 08 07:09:48 crc kubenswrapper[4717]: I0308 07:09:48.023587 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-z8plj_8815e0c5-e3aa-4015-95c2-e2091a21ef2f/cert-manager-webhook/0.log" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.168959 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549230-jsb68"] Mar 08 07:10:00 crc kubenswrapper[4717]: E0308 07:10:00.169842 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" 
containerName="extract-content" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.169854 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" containerName="extract-content" Mar 08 07:10:00 crc kubenswrapper[4717]: E0308 07:10:00.169866 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" containerName="extract-utilities" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.169873 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" containerName="extract-utilities" Mar 08 07:10:00 crc kubenswrapper[4717]: E0308 07:10:00.169886 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" containerName="registry-server" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.169892 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" containerName="registry-server" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.170079 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ef1b4b-5400-4c4c-8efe-58a442949c8e" containerName="registry-server" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.170783 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549230-jsb68" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.173754 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.173780 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.173970 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.183588 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549230-jsb68"] Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.296552 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knclp\" (UniqueName: \"kubernetes.io/projected/578d3092-82b0-4cb5-b1d0-8f34a4bdc871-kube-api-access-knclp\") pod \"auto-csr-approver-29549230-jsb68\" (UID: \"578d3092-82b0-4cb5-b1d0-8f34a4bdc871\") " pod="openshift-infra/auto-csr-approver-29549230-jsb68" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.398530 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knclp\" (UniqueName: \"kubernetes.io/projected/578d3092-82b0-4cb5-b1d0-8f34a4bdc871-kube-api-access-knclp\") pod \"auto-csr-approver-29549230-jsb68\" (UID: \"578d3092-82b0-4cb5-b1d0-8f34a4bdc871\") " pod="openshift-infra/auto-csr-approver-29549230-jsb68" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.426583 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knclp\" (UniqueName: \"kubernetes.io/projected/578d3092-82b0-4cb5-b1d0-8f34a4bdc871-kube-api-access-knclp\") pod \"auto-csr-approver-29549230-jsb68\" (UID: \"578d3092-82b0-4cb5-b1d0-8f34a4bdc871\") " 
pod="openshift-infra/auto-csr-approver-29549230-jsb68" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.495633 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549230-jsb68" Mar 08 07:10:00 crc kubenswrapper[4717]: I0308 07:10:00.977269 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549230-jsb68"] Mar 08 07:10:00 crc kubenswrapper[4717]: W0308 07:10:00.980443 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod578d3092_82b0_4cb5_b1d0_8f34a4bdc871.slice/crio-44aba9a6b4b2432819b1aaa1d9216b96ffdd007f5d57117c63425ce9bff687fc WatchSource:0}: Error finding container 44aba9a6b4b2432819b1aaa1d9216b96ffdd007f5d57117c63425ce9bff687fc: Status 404 returned error can't find the container with id 44aba9a6b4b2432819b1aaa1d9216b96ffdd007f5d57117c63425ce9bff687fc Mar 08 07:10:01 crc kubenswrapper[4717]: I0308 07:10:01.490788 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549230-jsb68" event={"ID":"578d3092-82b0-4cb5-b1d0-8f34a4bdc871","Type":"ContainerStarted","Data":"44aba9a6b4b2432819b1aaa1d9216b96ffdd007f5d57117c63425ce9bff687fc"} Mar 08 07:10:02 crc kubenswrapper[4717]: I0308 07:10:02.108371 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-gcpqm_a99ee055-c0a9-4a9b-8787-45f90f0e41f0/nmstate-console-plugin/0.log" Mar 08 07:10:02 crc kubenswrapper[4717]: I0308 07:10:02.278067 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-nvc4p_474b5a28-e5de-4fdc-814e-588f604686f4/kube-rbac-proxy/0.log" Mar 08 07:10:02 crc kubenswrapper[4717]: I0308 07:10:02.285414 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-57gbg_18fa48fe-7964-43d4-8e35-f0e459dd40ea/nmstate-handler/0.log" Mar 08 07:10:02 crc kubenswrapper[4717]: I0308 07:10:02.398496 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-nvc4p_474b5a28-e5de-4fdc-814e-588f604686f4/nmstate-metrics/0.log" Mar 08 07:10:02 crc kubenswrapper[4717]: I0308 07:10:02.499300 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549230-jsb68" event={"ID":"578d3092-82b0-4cb5-b1d0-8f34a4bdc871","Type":"ContainerStarted","Data":"438ee82ba6576f50ea145e22e1fc297f271c54fb76d518e92640d6ada2fc1761"} Mar 08 07:10:02 crc kubenswrapper[4717]: I0308 07:10:02.509251 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-qt88p_396ff7f1-399f-4510-96a1-d17996841dba/nmstate-operator/0.log" Mar 08 07:10:02 crc kubenswrapper[4717]: I0308 07:10:02.516260 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549230-jsb68" podStartSLOduration=1.338277681 podStartE2EDuration="2.51624546s" podCreationTimestamp="2026-03-08 07:10:00 +0000 UTC" firstStartedPulling="2026-03-08 07:10:00.98243575 +0000 UTC m=+6227.900084594" lastFinishedPulling="2026-03-08 07:10:02.160403529 +0000 UTC m=+6229.078052373" observedRunningTime="2026-03-08 07:10:02.51380047 +0000 UTC m=+6229.431449314" watchObservedRunningTime="2026-03-08 07:10:02.51624546 +0000 UTC m=+6229.433894304" Mar 08 07:10:02 crc kubenswrapper[4717]: I0308 07:10:02.632232 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-w6x5m_3e38f069-dcb2-471a-9124-87af836a0e11/nmstate-webhook/0.log" Mar 08 07:10:03 crc kubenswrapper[4717]: I0308 07:10:03.509017 4717 generic.go:334] "Generic (PLEG): container finished" podID="578d3092-82b0-4cb5-b1d0-8f34a4bdc871" 
containerID="438ee82ba6576f50ea145e22e1fc297f271c54fb76d518e92640d6ada2fc1761" exitCode=0 Mar 08 07:10:03 crc kubenswrapper[4717]: I0308 07:10:03.509111 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549230-jsb68" event={"ID":"578d3092-82b0-4cb5-b1d0-8f34a4bdc871","Type":"ContainerDied","Data":"438ee82ba6576f50ea145e22e1fc297f271c54fb76d518e92640d6ada2fc1761"} Mar 08 07:10:04 crc kubenswrapper[4717]: I0308 07:10:04.120176 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 07:10:04 crc kubenswrapper[4717]: I0308 07:10:04.120617 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 07:10:04 crc kubenswrapper[4717]: I0308 07:10:04.872528 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549230-jsb68" Mar 08 07:10:05 crc kubenswrapper[4717]: I0308 07:10:05.018455 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knclp\" (UniqueName: \"kubernetes.io/projected/578d3092-82b0-4cb5-b1d0-8f34a4bdc871-kube-api-access-knclp\") pod \"578d3092-82b0-4cb5-b1d0-8f34a4bdc871\" (UID: \"578d3092-82b0-4cb5-b1d0-8f34a4bdc871\") " Mar 08 07:10:05 crc kubenswrapper[4717]: I0308 07:10:05.032741 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578d3092-82b0-4cb5-b1d0-8f34a4bdc871-kube-api-access-knclp" (OuterVolumeSpecName: "kube-api-access-knclp") pod "578d3092-82b0-4cb5-b1d0-8f34a4bdc871" (UID: "578d3092-82b0-4cb5-b1d0-8f34a4bdc871"). InnerVolumeSpecName "kube-api-access-knclp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:10:05 crc kubenswrapper[4717]: I0308 07:10:05.121276 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knclp\" (UniqueName: \"kubernetes.io/projected/578d3092-82b0-4cb5-b1d0-8f34a4bdc871-kube-api-access-knclp\") on node \"crc\" DevicePath \"\"" Mar 08 07:10:05 crc kubenswrapper[4717]: I0308 07:10:05.526447 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549230-jsb68" event={"ID":"578d3092-82b0-4cb5-b1d0-8f34a4bdc871","Type":"ContainerDied","Data":"44aba9a6b4b2432819b1aaa1d9216b96ffdd007f5d57117c63425ce9bff687fc"} Mar 08 07:10:05 crc kubenswrapper[4717]: I0308 07:10:05.526490 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44aba9a6b4b2432819b1aaa1d9216b96ffdd007f5d57117c63425ce9bff687fc" Mar 08 07:10:05 crc kubenswrapper[4717]: I0308 07:10:05.526521 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549230-jsb68" Mar 08 07:10:05 crc kubenswrapper[4717]: I0308 07:10:05.594748 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549224-dr225"] Mar 08 07:10:05 crc kubenswrapper[4717]: I0308 07:10:05.604004 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549224-dr225"] Mar 08 07:10:05 crc kubenswrapper[4717]: I0308 07:10:05.792040 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d07649a-00a6-490f-a0ff-7386ab7ca669" path="/var/lib/kubelet/pods/0d07649a-00a6-490f-a0ff-7386ab7ca669/volumes" Mar 08 07:10:18 crc kubenswrapper[4717]: I0308 07:10:18.258964 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9mtbk_186bbba6-72b1-4834-9f78-65c0099a8be8/prometheus-operator/0.log" Mar 08 07:10:18 crc kubenswrapper[4717]: I0308 07:10:18.459884 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp_dad3a63c-7244-41bd-85d4-38046d2ecf3f/prometheus-operator-admission-webhook/0.log" Mar 08 07:10:18 crc kubenswrapper[4717]: I0308 07:10:18.512034 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd_0e21fc32-f762-4f29-9ed5-7ab0e28be6a7/prometheus-operator-admission-webhook/0.log" Mar 08 07:10:18 crc kubenswrapper[4717]: I0308 07:10:18.673453 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-pjlrw_5eaaec5c-b81f-4400-8237-3cb96bac6a73/operator/0.log" Mar 08 07:10:18 crc kubenswrapper[4717]: I0308 07:10:18.730470 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-hmpm8_6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5/perses-operator/0.log" Mar 08 
07:10:33 crc kubenswrapper[4717]: I0308 07:10:33.642088 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-96ksw_65f01f58-dbf5-4547-9249-ab613d4f85db/kube-rbac-proxy/0.log" Mar 08 07:10:33 crc kubenswrapper[4717]: I0308 07:10:33.709195 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-96ksw_65f01f58-dbf5-4547-9249-ab613d4f85db/controller/0.log" Mar 08 07:10:33 crc kubenswrapper[4717]: I0308 07:10:33.833547 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-vwtjg_88b4e6f2-24f0-4e67-ab40-e3621ab5b44f/frr-k8s-webhook-server/0.log" Mar 08 07:10:33 crc kubenswrapper[4717]: I0308 07:10:33.910854 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-frr-files/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.110305 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-frr-files/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.120121 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.120176 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.120215 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.120951 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"efc60f3a3d797f62c0e0f878c3cc6f0020d779e8e22bd018a1c6e5cf0f474597"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.121018 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://efc60f3a3d797f62c0e0f878c3cc6f0020d779e8e22bd018a1c6e5cf0f474597" gracePeriod=600 Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.150380 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-reloader/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.151029 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-reloader/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.156148 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-metrics/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.371778 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-metrics/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.427222 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-reloader/0.log" Mar 08 07:10:34 crc 
kubenswrapper[4717]: I0308 07:10:34.460103 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-frr-files/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.478893 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-metrics/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.667570 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-frr-files/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.714973 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-metrics/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.723629 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/cp-reloader/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.760139 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/controller/0.log" Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.876132 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="efc60f3a3d797f62c0e0f878c3cc6f0020d779e8e22bd018a1c6e5cf0f474597" exitCode=0 Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.876474 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"efc60f3a3d797f62c0e0f878c3cc6f0020d779e8e22bd018a1c6e5cf0f474597"} Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.876504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerStarted","Data":"8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17"} Mar 08 07:10:34 crc kubenswrapper[4717]: I0308 07:10:34.876524 4717 scope.go:117] "RemoveContainer" containerID="8d67012fff91876bdfa5a0d5cf8e0c49c80aff79418083e065d909697290462f" Mar 08 07:10:35 crc kubenswrapper[4717]: I0308 07:10:35.104326 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/frr-metrics/0.log" Mar 08 07:10:35 crc kubenswrapper[4717]: I0308 07:10:35.166369 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/kube-rbac-proxy-frr/0.log" Mar 08 07:10:35 crc kubenswrapper[4717]: I0308 07:10:35.170079 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/kube-rbac-proxy/0.log" Mar 08 07:10:35 crc kubenswrapper[4717]: I0308 07:10:35.302212 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/reloader/0.log" Mar 08 07:10:35 crc kubenswrapper[4717]: I0308 07:10:35.415538 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d59c89549-fbpjz_de9b0a52-bf0f-4566-bcb4-f52c31916a41/manager/0.log" Mar 08 07:10:35 crc kubenswrapper[4717]: I0308 07:10:35.588591 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7f74747698-24c78_3114eda7-af43-45d9-955c-116f643af398/webhook-server/0.log" Mar 08 07:10:35 crc kubenswrapper[4717]: I0308 07:10:35.816147 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-27q99_b4a0e98d-c9c3-4d97-ab3a-cd63903fd104/kube-rbac-proxy/0.log" Mar 08 07:10:36 crc 
kubenswrapper[4717]: I0308 07:10:36.298085 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-27q99_b4a0e98d-c9c3-4d97-ab3a-cd63903fd104/speaker/0.log" Mar 08 07:10:37 crc kubenswrapper[4717]: I0308 07:10:37.078355 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wqp4w_804f0686-e4ef-4cd6-bbe2-a2e7788759e2/frr/0.log" Mar 08 07:10:42 crc kubenswrapper[4717]: I0308 07:10:42.812537 4717 scope.go:117] "RemoveContainer" containerID="99c33235fa34c78bde25c11d6e8447fa34d936525be1d95e073c021643372bb4" Mar 08 07:10:42 crc kubenswrapper[4717]: I0308 07:10:42.869230 4717 scope.go:117] "RemoveContainer" containerID="49d0dff9cf8b1fb3f20d4ad75b643ad34f03e519141c94afba54f87930300b0b" Mar 08 07:10:42 crc kubenswrapper[4717]: I0308 07:10:42.919493 4717 scope.go:117] "RemoveContainer" containerID="989115562e9f7d9dfb7966e46b47df99e53718a2e5b4a416bd7f78d34444b628" Mar 08 07:10:42 crc kubenswrapper[4717]: I0308 07:10:42.967516 4717 scope.go:117] "RemoveContainer" containerID="050ddc11b88294df219e40b1357f570b83be80e7e264e3153755c30021f5d9aa" Mar 08 07:10:49 crc kubenswrapper[4717]: I0308 07:10:49.441786 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/util/0.log" Mar 08 07:10:49 crc kubenswrapper[4717]: I0308 07:10:49.660678 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/pull/0.log" Mar 08 07:10:49 crc kubenswrapper[4717]: I0308 07:10:49.664158 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/util/0.log" Mar 08 07:10:49 crc kubenswrapper[4717]: I0308 07:10:49.700328 4717 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/pull/0.log" Mar 08 07:10:49 crc kubenswrapper[4717]: I0308 07:10:49.883306 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/pull/0.log" Mar 08 07:10:49 crc kubenswrapper[4717]: I0308 07:10:49.889394 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/util/0.log" Mar 08 07:10:49 crc kubenswrapper[4717]: I0308 07:10:49.903662 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a8296rsd_ea43c0a0-25a8-4da0-b2a3-f94c285c9e58/extract/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.027989 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/util/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.212859 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/util/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.215550 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/pull/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.229143 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/pull/0.log" 
Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.400428 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/util/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.429233 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/pull/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.455346 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084wb6c_8ab2fa77-0e5e-4c32-86af-eacf41b1902e/extract/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.593391 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-utilities/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.759063 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-content/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.762253 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-content/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.785794 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-utilities/0.log" Mar 08 07:10:50 crc kubenswrapper[4717]: I0308 07:10:50.953214 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-utilities/0.log" Mar 08 07:10:50 crc 
kubenswrapper[4717]: I0308 07:10:50.981570 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/extract-content/0.log" Mar 08 07:10:51 crc kubenswrapper[4717]: I0308 07:10:51.146597 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-utilities/0.log" Mar 08 07:10:51 crc kubenswrapper[4717]: I0308 07:10:51.380956 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-content/0.log" Mar 08 07:10:51 crc kubenswrapper[4717]: I0308 07:10:51.406725 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-utilities/0.log" Mar 08 07:10:51 crc kubenswrapper[4717]: I0308 07:10:51.441656 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-content/0.log" Mar 08 07:10:51 crc kubenswrapper[4717]: I0308 07:10:51.638915 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-utilities/0.log" Mar 08 07:10:51 crc kubenswrapper[4717]: I0308 07:10:51.664967 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5skl_ff534094-b1ae-4777-955f-322d8f2bfc65/registry-server/0.log" Mar 08 07:10:51 crc kubenswrapper[4717]: I0308 07:10:51.691249 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/extract-content/0.log" Mar 08 07:10:51 crc kubenswrapper[4717]: I0308 07:10:51.883189 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/util/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.129788 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/util/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.135348 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/pull/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.169582 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpcgg_4e9cba03-4895-49ad-ae18-c6d5ebd55311/registry-server/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.185595 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/pull/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.311161 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/pull/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.332520 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/util/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.340266 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f47lpvt_0531d9bf-3f78-45d6-af95-ec8b54e8fc1e/extract/0.log" Mar 08 07:10:52 crc 
kubenswrapper[4717]: I0308 07:10:52.535093 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cmqxx_61147cf3-b98d-4c9f-a053-2d818468c5e0/marketplace-operator/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.554674 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-utilities/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.705510 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-utilities/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.716440 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-content/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.737244 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-content/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.907311 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-utilities/0.log" Mar 08 07:10:52 crc kubenswrapper[4717]: I0308 07:10:52.988901 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/extract-content/0.log" Mar 08 07:10:53 crc kubenswrapper[4717]: I0308 07:10:53.105774 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bvmlr_13ec80f4-5952-4e71-8aaa-18643bdfae3d/registry-server/0.log" Mar 08 07:10:53 crc kubenswrapper[4717]: I0308 07:10:53.171525 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-utilities/0.log" Mar 08 07:10:53 crc kubenswrapper[4717]: I0308 07:10:53.362039 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-utilities/0.log" Mar 08 07:10:53 crc kubenswrapper[4717]: I0308 07:10:53.398523 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-content/0.log" Mar 08 07:10:53 crc kubenswrapper[4717]: I0308 07:10:53.432547 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-content/0.log" Mar 08 07:10:53 crc kubenswrapper[4717]: I0308 07:10:53.622631 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-utilities/0.log" Mar 08 07:10:53 crc kubenswrapper[4717]: I0308 07:10:53.634202 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/extract-content/0.log" Mar 08 07:10:54 crc kubenswrapper[4717]: I0308 07:10:54.266868 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5chdv_877ad6e6-1569-4e9c-a1fb-a2226718fa2d/registry-server/0.log" Mar 08 07:11:06 crc kubenswrapper[4717]: I0308 07:11:06.874493 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9mtbk_186bbba6-72b1-4834-9f78-65c0099a8be8/prometheus-operator/0.log" Mar 08 07:11:06 crc kubenswrapper[4717]: I0308 07:11:06.917723 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-659cd7cbf8-9slmp_dad3a63c-7244-41bd-85d4-38046d2ecf3f/prometheus-operator-admission-webhook/0.log" Mar 08 07:11:06 crc kubenswrapper[4717]: I0308 07:11:06.935784 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-659cd7cbf8-qwrgd_0e21fc32-f762-4f29-9ed5-7ab0e28be6a7/prometheus-operator-admission-webhook/0.log" Mar 08 07:11:07 crc kubenswrapper[4717]: I0308 07:11:07.110468 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-pjlrw_5eaaec5c-b81f-4400-8237-3cb96bac6a73/operator/0.log" Mar 08 07:11:07 crc kubenswrapper[4717]: I0308 07:11:07.113365 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-hmpm8_6dfc30c5-04dd-4c4f-96ad-9ebdbaf84dd5/perses-operator/0.log" Mar 08 07:11:24 crc kubenswrapper[4717]: I0308 07:11:24.922237 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-knmcb"] Mar 08 07:11:24 crc kubenswrapper[4717]: E0308 07:11:24.923404 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578d3092-82b0-4cb5-b1d0-8f34a4bdc871" containerName="oc" Mar 08 07:11:24 crc kubenswrapper[4717]: I0308 07:11:24.923419 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="578d3092-82b0-4cb5-b1d0-8f34a4bdc871" containerName="oc" Mar 08 07:11:24 crc kubenswrapper[4717]: I0308 07:11:24.923674 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="578d3092-82b0-4cb5-b1d0-8f34a4bdc871" containerName="oc" Mar 08 07:11:24 crc kubenswrapper[4717]: I0308 07:11:24.927030 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:24 crc kubenswrapper[4717]: I0308 07:11:24.951950 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knmcb"] Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.053338 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-catalog-content\") pod \"redhat-operators-knmcb\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.053471 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8vjk\" (UniqueName: \"kubernetes.io/projected/8b764833-e1b5-477c-a546-914ea6a1709a-kube-api-access-r8vjk\") pod \"redhat-operators-knmcb\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.053770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-utilities\") pod \"redhat-operators-knmcb\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.155875 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-catalog-content\") pod \"redhat-operators-knmcb\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.155986 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-r8vjk\" (UniqueName: \"kubernetes.io/projected/8b764833-e1b5-477c-a546-914ea6a1709a-kube-api-access-r8vjk\") pod \"redhat-operators-knmcb\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.156054 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-utilities\") pod \"redhat-operators-knmcb\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.157133 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-utilities\") pod \"redhat-operators-knmcb\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.157401 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-catalog-content\") pod \"redhat-operators-knmcb\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.188559 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8vjk\" (UniqueName: \"kubernetes.io/projected/8b764833-e1b5-477c-a546-914ea6a1709a-kube-api-access-r8vjk\") pod \"redhat-operators-knmcb\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.280674 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:25 crc kubenswrapper[4717]: I0308 07:11:25.775141 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knmcb"] Mar 08 07:11:26 crc kubenswrapper[4717]: I0308 07:11:26.407202 4717 generic.go:334] "Generic (PLEG): container finished" podID="8b764833-e1b5-477c-a546-914ea6a1709a" containerID="8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393" exitCode=0 Mar 08 07:11:26 crc kubenswrapper[4717]: I0308 07:11:26.407469 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmcb" event={"ID":"8b764833-e1b5-477c-a546-914ea6a1709a","Type":"ContainerDied","Data":"8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393"} Mar 08 07:11:26 crc kubenswrapper[4717]: I0308 07:11:26.407496 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmcb" event={"ID":"8b764833-e1b5-477c-a546-914ea6a1709a","Type":"ContainerStarted","Data":"64f3e51c5a4e9f2619d1bc6bcb507695f4ad1ecf7b8072f1c0d164a0b8548522"} Mar 08 07:11:27 crc kubenswrapper[4717]: I0308 07:11:27.429421 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmcb" event={"ID":"8b764833-e1b5-477c-a546-914ea6a1709a","Type":"ContainerStarted","Data":"a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e"} Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.522746 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fsq7g"] Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.524952 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.544925 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fsq7g"] Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.619822 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-utilities\") pod \"community-operators-fsq7g\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.620026 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-catalog-content\") pod \"community-operators-fsq7g\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.620125 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4st9b\" (UniqueName: \"kubernetes.io/projected/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-kube-api-access-4st9b\") pod \"community-operators-fsq7g\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.721760 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-utilities\") pod \"community-operators-fsq7g\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.721864 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-catalog-content\") pod \"community-operators-fsq7g\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.721892 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4st9b\" (UniqueName: \"kubernetes.io/projected/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-kube-api-access-4st9b\") pod \"community-operators-fsq7g\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.722454 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-utilities\") pod \"community-operators-fsq7g\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.722512 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-catalog-content\") pod \"community-operators-fsq7g\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.749482 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4st9b\" (UniqueName: \"kubernetes.io/projected/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-kube-api-access-4st9b\") pod \"community-operators-fsq7g\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:28 crc kubenswrapper[4717]: I0308 07:11:28.855371 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:29 crc kubenswrapper[4717]: I0308 07:11:29.439017 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fsq7g"] Mar 08 07:11:30 crc kubenswrapper[4717]: I0308 07:11:30.461740 4717 generic.go:334] "Generic (PLEG): container finished" podID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerID="98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb" exitCode=0 Mar 08 07:11:30 crc kubenswrapper[4717]: I0308 07:11:30.461787 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsq7g" event={"ID":"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66","Type":"ContainerDied","Data":"98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb"} Mar 08 07:11:30 crc kubenswrapper[4717]: I0308 07:11:30.462265 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsq7g" event={"ID":"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66","Type":"ContainerStarted","Data":"f93999359a2403785fa629c8d032b0679a4a01ceff94bd80e15e9bd7ed17af09"} Mar 08 07:11:31 crc kubenswrapper[4717]: I0308 07:11:31.476773 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsq7g" event={"ID":"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66","Type":"ContainerStarted","Data":"3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29"} Mar 08 07:11:32 crc kubenswrapper[4717]: I0308 07:11:32.489925 4717 generic.go:334] "Generic (PLEG): container finished" podID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerID="3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29" exitCode=0 Mar 08 07:11:32 crc kubenswrapper[4717]: I0308 07:11:32.490027 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsq7g" 
event={"ID":"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66","Type":"ContainerDied","Data":"3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29"} Mar 08 07:11:32 crc kubenswrapper[4717]: I0308 07:11:32.496412 4717 generic.go:334] "Generic (PLEG): container finished" podID="8b764833-e1b5-477c-a546-914ea6a1709a" containerID="a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e" exitCode=0 Mar 08 07:11:32 crc kubenswrapper[4717]: I0308 07:11:32.496457 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmcb" event={"ID":"8b764833-e1b5-477c-a546-914ea6a1709a","Type":"ContainerDied","Data":"a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e"} Mar 08 07:11:33 crc kubenswrapper[4717]: I0308 07:11:33.513139 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmcb" event={"ID":"8b764833-e1b5-477c-a546-914ea6a1709a","Type":"ContainerStarted","Data":"21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a"} Mar 08 07:11:33 crc kubenswrapper[4717]: I0308 07:11:33.518381 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsq7g" event={"ID":"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66","Type":"ContainerStarted","Data":"db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf"} Mar 08 07:11:34 crc kubenswrapper[4717]: I0308 07:11:34.568979 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-knmcb" podStartSLOduration=3.97418203 podStartE2EDuration="10.5689518s" podCreationTimestamp="2026-03-08 07:11:24 +0000 UTC" firstStartedPulling="2026-03-08 07:11:26.414055325 +0000 UTC m=+6313.331704169" lastFinishedPulling="2026-03-08 07:11:33.008825095 +0000 UTC m=+6319.926473939" observedRunningTime="2026-03-08 07:11:34.557906729 +0000 UTC m=+6321.475555613" watchObservedRunningTime="2026-03-08 07:11:34.5689518 +0000 UTC m=+6321.486600654" Mar 
08 07:11:34 crc kubenswrapper[4717]: I0308 07:11:34.602930 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fsq7g" podStartSLOduration=4.100600891 podStartE2EDuration="6.602907551s" podCreationTimestamp="2026-03-08 07:11:28 +0000 UTC" firstStartedPulling="2026-03-08 07:11:30.463575454 +0000 UTC m=+6317.381224298" lastFinishedPulling="2026-03-08 07:11:32.965882114 +0000 UTC m=+6319.883530958" observedRunningTime="2026-03-08 07:11:34.594351901 +0000 UTC m=+6321.512000755" watchObservedRunningTime="2026-03-08 07:11:34.602907551 +0000 UTC m=+6321.520556395" Mar 08 07:11:35 crc kubenswrapper[4717]: I0308 07:11:35.281843 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:35 crc kubenswrapper[4717]: I0308 07:11:35.281917 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:36 crc kubenswrapper[4717]: I0308 07:11:36.342290 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-knmcb" podUID="8b764833-e1b5-477c-a546-914ea6a1709a" containerName="registry-server" probeResult="failure" output=< Mar 08 07:11:36 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Mar 08 07:11:36 crc kubenswrapper[4717]: > Mar 08 07:11:38 crc kubenswrapper[4717]: I0308 07:11:38.856139 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:38 crc kubenswrapper[4717]: I0308 07:11:38.857883 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:38 crc kubenswrapper[4717]: I0308 07:11:38.917318 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fsq7g" Mar 08 
07:11:39 crc kubenswrapper[4717]: I0308 07:11:39.644940 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:39 crc kubenswrapper[4717]: I0308 07:11:39.705362 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fsq7g"] Mar 08 07:11:41 crc kubenswrapper[4717]: I0308 07:11:41.606323 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fsq7g" podUID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerName="registry-server" containerID="cri-o://db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf" gracePeriod=2 Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.112330 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.251899 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-utilities\") pod \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.251946 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-catalog-content\") pod \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.252237 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4st9b\" (UniqueName: \"kubernetes.io/projected/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-kube-api-access-4st9b\") pod \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\" (UID: \"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66\") " Mar 
08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.253895 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-utilities" (OuterVolumeSpecName: "utilities") pod "919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" (UID: "919ed64b-2e76-4aaf-ab86-dfdfc7c82b66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.261933 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-kube-api-access-4st9b" (OuterVolumeSpecName: "kube-api-access-4st9b") pod "919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" (UID: "919ed64b-2e76-4aaf-ab86-dfdfc7c82b66"). InnerVolumeSpecName "kube-api-access-4st9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.336157 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" (UID: "919ed64b-2e76-4aaf-ab86-dfdfc7c82b66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.354873 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4st9b\" (UniqueName: \"kubernetes.io/projected/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-kube-api-access-4st9b\") on node \"crc\" DevicePath \"\"" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.354905 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.354922 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.619046 4717 generic.go:334] "Generic (PLEG): container finished" podID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerID="db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf" exitCode=0 Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.619123 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fsq7g" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.619135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsq7g" event={"ID":"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66","Type":"ContainerDied","Data":"db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf"} Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.619517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fsq7g" event={"ID":"919ed64b-2e76-4aaf-ab86-dfdfc7c82b66","Type":"ContainerDied","Data":"f93999359a2403785fa629c8d032b0679a4a01ceff94bd80e15e9bd7ed17af09"} Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.619541 4717 scope.go:117] "RemoveContainer" containerID="db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.646416 4717 scope.go:117] "RemoveContainer" containerID="3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.669205 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fsq7g"] Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.682102 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fsq7g"] Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.692591 4717 scope.go:117] "RemoveContainer" containerID="98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.729419 4717 scope.go:117] "RemoveContainer" containerID="db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf" Mar 08 07:11:42 crc kubenswrapper[4717]: E0308 07:11:42.730004 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf\": container with ID starting with db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf not found: ID does not exist" containerID="db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.730033 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf"} err="failed to get container status \"db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf\": rpc error: code = NotFound desc = could not find container \"db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf\": container with ID starting with db63d7c620c0fdfcfe17e699c4a997c796b57e406d5ce6703e33eaef437d9fbf not found: ID does not exist" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.730075 4717 scope.go:117] "RemoveContainer" containerID="3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29" Mar 08 07:11:42 crc kubenswrapper[4717]: E0308 07:11:42.730361 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29\": container with ID starting with 3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29 not found: ID does not exist" containerID="3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.730416 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29"} err="failed to get container status \"3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29\": rpc error: code = NotFound desc = could not find container \"3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29\": container with ID 
starting with 3acee54ad3f0b65418d7504e00ba07ec05169f0fb6a35adcc45a121110241d29 not found: ID does not exist" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.730436 4717 scope.go:117] "RemoveContainer" containerID="98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb" Mar 08 07:11:42 crc kubenswrapper[4717]: E0308 07:11:42.731109 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb\": container with ID starting with 98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb not found: ID does not exist" containerID="98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb" Mar 08 07:11:42 crc kubenswrapper[4717]: I0308 07:11:42.731184 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb"} err="failed to get container status \"98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb\": rpc error: code = NotFound desc = could not find container \"98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb\": container with ID starting with 98b359b265568bdc8620789e2ef84067424409c6d8ddbccca62851498eeb55fb not found: ID does not exist" Mar 08 07:11:42 crc kubenswrapper[4717]: E0308 07:11:42.826421 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod919ed64b_2e76_4aaf_ab86_dfdfc7c82b66.slice/crio-f93999359a2403785fa629c8d032b0679a4a01ceff94bd80e15e9bd7ed17af09\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod919ed64b_2e76_4aaf_ab86_dfdfc7c82b66.slice\": RecentStats: unable to find data in memory cache]" Mar 08 07:11:43 crc kubenswrapper[4717]: I0308 07:11:43.815546 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" path="/var/lib/kubelet/pods/919ed64b-2e76-4aaf-ab86-dfdfc7c82b66/volumes" Mar 08 07:11:45 crc kubenswrapper[4717]: I0308 07:11:45.369891 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:45 crc kubenswrapper[4717]: I0308 07:11:45.466230 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:45 crc kubenswrapper[4717]: I0308 07:11:45.712703 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knmcb"] Mar 08 07:11:46 crc kubenswrapper[4717]: I0308 07:11:46.672433 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-knmcb" podUID="8b764833-e1b5-477c-a546-914ea6a1709a" containerName="registry-server" containerID="cri-o://21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a" gracePeriod=2 Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.201983 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.272900 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-catalog-content\") pod \"8b764833-e1b5-477c-a546-914ea6a1709a\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.272952 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-utilities\") pod \"8b764833-e1b5-477c-a546-914ea6a1709a\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.273048 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8vjk\" (UniqueName: \"kubernetes.io/projected/8b764833-e1b5-477c-a546-914ea6a1709a-kube-api-access-r8vjk\") pod \"8b764833-e1b5-477c-a546-914ea6a1709a\" (UID: \"8b764833-e1b5-477c-a546-914ea6a1709a\") " Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.274511 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-utilities" (OuterVolumeSpecName: "utilities") pod "8b764833-e1b5-477c-a546-914ea6a1709a" (UID: "8b764833-e1b5-477c-a546-914ea6a1709a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.298006 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b764833-e1b5-477c-a546-914ea6a1709a-kube-api-access-r8vjk" (OuterVolumeSpecName: "kube-api-access-r8vjk") pod "8b764833-e1b5-477c-a546-914ea6a1709a" (UID: "8b764833-e1b5-477c-a546-914ea6a1709a"). InnerVolumeSpecName "kube-api-access-r8vjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.374991 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.375021 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8vjk\" (UniqueName: \"kubernetes.io/projected/8b764833-e1b5-477c-a546-914ea6a1709a-kube-api-access-r8vjk\") on node \"crc\" DevicePath \"\"" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.436653 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b764833-e1b5-477c-a546-914ea6a1709a" (UID: "8b764833-e1b5-477c-a546-914ea6a1709a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.476563 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b764833-e1b5-477c-a546-914ea6a1709a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.700970 4717 generic.go:334] "Generic (PLEG): container finished" podID="8b764833-e1b5-477c-a546-914ea6a1709a" containerID="21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a" exitCode=0 Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.701017 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knmcb" event={"ID":"8b764833-e1b5-477c-a546-914ea6a1709a","Type":"ContainerDied","Data":"21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a"} Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.701041 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-knmcb" event={"ID":"8b764833-e1b5-477c-a546-914ea6a1709a","Type":"ContainerDied","Data":"64f3e51c5a4e9f2619d1bc6bcb507695f4ad1ecf7b8072f1c0d164a0b8548522"} Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.701058 4717 scope.go:117] "RemoveContainer" containerID="21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.701065 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knmcb" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.734121 4717 scope.go:117] "RemoveContainer" containerID="a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.756375 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knmcb"] Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.768318 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-knmcb"] Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.773092 4717 scope.go:117] "RemoveContainer" containerID="8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.793508 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b764833-e1b5-477c-a546-914ea6a1709a" path="/var/lib/kubelet/pods/8b764833-e1b5-477c-a546-914ea6a1709a/volumes" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.829129 4717 scope.go:117] "RemoveContainer" containerID="21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a" Mar 08 07:11:47 crc kubenswrapper[4717]: E0308 07:11:47.829572 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a\": container with ID starting with 
21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a not found: ID does not exist" containerID="21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.829614 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a"} err="failed to get container status \"21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a\": rpc error: code = NotFound desc = could not find container \"21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a\": container with ID starting with 21c5e7824db2ecb5e2af5578415452631ae815763177b9081857fb7b1e8d7f8a not found: ID does not exist" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.829641 4717 scope.go:117] "RemoveContainer" containerID="a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e" Mar 08 07:11:47 crc kubenswrapper[4717]: E0308 07:11:47.830016 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e\": container with ID starting with a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e not found: ID does not exist" containerID="a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.830075 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e"} err="failed to get container status \"a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e\": rpc error: code = NotFound desc = could not find container \"a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e\": container with ID starting with a81418425db08956d3c5bb1e70ea9ac5f36ab33a8b39a1f48a819e631cdc4a0e not found: ID does not 
exist" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.830110 4717 scope.go:117] "RemoveContainer" containerID="8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393" Mar 08 07:11:47 crc kubenswrapper[4717]: E0308 07:11:47.830382 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393\": container with ID starting with 8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393 not found: ID does not exist" containerID="8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393" Mar 08 07:11:47 crc kubenswrapper[4717]: I0308 07:11:47.830414 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393"} err="failed to get container status \"8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393\": rpc error: code = NotFound desc = could not find container \"8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393\": container with ID starting with 8fcac3708635474ab5fcd5866d4a039545403f2af56e9f546b6e2bdc1a8c8393 not found: ID does not exist" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.175266 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549232-gw7st"] Mar 08 07:12:00 crc kubenswrapper[4717]: E0308 07:12:00.176578 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b764833-e1b5-477c-a546-914ea6a1709a" containerName="registry-server" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.176600 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b764833-e1b5-477c-a546-914ea6a1709a" containerName="registry-server" Mar 08 07:12:00 crc kubenswrapper[4717]: E0308 07:12:00.176625 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b764833-e1b5-477c-a546-914ea6a1709a" 
containerName="extract-utilities" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.176635 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b764833-e1b5-477c-a546-914ea6a1709a" containerName="extract-utilities" Mar 08 07:12:00 crc kubenswrapper[4717]: E0308 07:12:00.176654 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerName="extract-content" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.176663 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerName="extract-content" Mar 08 07:12:00 crc kubenswrapper[4717]: E0308 07:12:00.176726 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerName="extract-utilities" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.176737 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerName="extract-utilities" Mar 08 07:12:00 crc kubenswrapper[4717]: E0308 07:12:00.176774 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerName="registry-server" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.176784 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerName="registry-server" Mar 08 07:12:00 crc kubenswrapper[4717]: E0308 07:12:00.176799 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b764833-e1b5-477c-a546-914ea6a1709a" containerName="extract-content" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.176809 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b764833-e1b5-477c-a546-914ea6a1709a" containerName="extract-content" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.177106 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b764833-e1b5-477c-a546-914ea6a1709a" 
containerName="registry-server" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.177141 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="919ed64b-2e76-4aaf-ab86-dfdfc7c82b66" containerName="registry-server" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.178193 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549232-gw7st" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.181296 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.181487 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.181553 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.199863 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549232-gw7st"] Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.204397 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsgl7\" (UniqueName: \"kubernetes.io/projected/7476f9a4-fc17-4134-abda-275e7abd1f43-kube-api-access-zsgl7\") pod \"auto-csr-approver-29549232-gw7st\" (UID: \"7476f9a4-fc17-4134-abda-275e7abd1f43\") " pod="openshift-infra/auto-csr-approver-29549232-gw7st" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.305460 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsgl7\" (UniqueName: \"kubernetes.io/projected/7476f9a4-fc17-4134-abda-275e7abd1f43-kube-api-access-zsgl7\") pod \"auto-csr-approver-29549232-gw7st\" (UID: \"7476f9a4-fc17-4134-abda-275e7abd1f43\") " pod="openshift-infra/auto-csr-approver-29549232-gw7st" Mar 08 
07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.328138 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsgl7\" (UniqueName: \"kubernetes.io/projected/7476f9a4-fc17-4134-abda-275e7abd1f43-kube-api-access-zsgl7\") pod \"auto-csr-approver-29549232-gw7st\" (UID: \"7476f9a4-fc17-4134-abda-275e7abd1f43\") " pod="openshift-infra/auto-csr-approver-29549232-gw7st" Mar 08 07:12:00 crc kubenswrapper[4717]: I0308 07:12:00.523942 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549232-gw7st" Mar 08 07:12:01 crc kubenswrapper[4717]: I0308 07:12:01.030866 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549232-gw7st"] Mar 08 07:12:01 crc kubenswrapper[4717]: I0308 07:12:01.866062 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549232-gw7st" event={"ID":"7476f9a4-fc17-4134-abda-275e7abd1f43","Type":"ContainerStarted","Data":"978cd693522fbaf594e8a96a9cbfdeaa9e8751333602172eebae151e5e5c89a2"} Mar 08 07:12:02 crc kubenswrapper[4717]: I0308 07:12:02.881885 4717 generic.go:334] "Generic (PLEG): container finished" podID="7476f9a4-fc17-4134-abda-275e7abd1f43" containerID="2aee824acf25cf8b544cd4b91faf25339fa3312279681e92c4b51d8f728d6147" exitCode=0 Mar 08 07:12:02 crc kubenswrapper[4717]: I0308 07:12:02.882225 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549232-gw7st" event={"ID":"7476f9a4-fc17-4134-abda-275e7abd1f43","Type":"ContainerDied","Data":"2aee824acf25cf8b544cd4b91faf25339fa3312279681e92c4b51d8f728d6147"} Mar 08 07:12:04 crc kubenswrapper[4717]: I0308 07:12:04.265110 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549232-gw7st" Mar 08 07:12:04 crc kubenswrapper[4717]: I0308 07:12:04.397498 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsgl7\" (UniqueName: \"kubernetes.io/projected/7476f9a4-fc17-4134-abda-275e7abd1f43-kube-api-access-zsgl7\") pod \"7476f9a4-fc17-4134-abda-275e7abd1f43\" (UID: \"7476f9a4-fc17-4134-abda-275e7abd1f43\") " Mar 08 07:12:04 crc kubenswrapper[4717]: I0308 07:12:04.404960 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7476f9a4-fc17-4134-abda-275e7abd1f43-kube-api-access-zsgl7" (OuterVolumeSpecName: "kube-api-access-zsgl7") pod "7476f9a4-fc17-4134-abda-275e7abd1f43" (UID: "7476f9a4-fc17-4134-abda-275e7abd1f43"). InnerVolumeSpecName "kube-api-access-zsgl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:12:04 crc kubenswrapper[4717]: I0308 07:12:04.500528 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsgl7\" (UniqueName: \"kubernetes.io/projected/7476f9a4-fc17-4134-abda-275e7abd1f43-kube-api-access-zsgl7\") on node \"crc\" DevicePath \"\"" Mar 08 07:12:04 crc kubenswrapper[4717]: I0308 07:12:04.920822 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549232-gw7st" event={"ID":"7476f9a4-fc17-4134-abda-275e7abd1f43","Type":"ContainerDied","Data":"978cd693522fbaf594e8a96a9cbfdeaa9e8751333602172eebae151e5e5c89a2"} Mar 08 07:12:04 crc kubenswrapper[4717]: I0308 07:12:04.920870 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="978cd693522fbaf594e8a96a9cbfdeaa9e8751333602172eebae151e5e5c89a2" Mar 08 07:12:04 crc kubenswrapper[4717]: I0308 07:12:04.920931 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549232-gw7st" Mar 08 07:12:05 crc kubenswrapper[4717]: I0308 07:12:05.390511 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549226-9zmmh"] Mar 08 07:12:05 crc kubenswrapper[4717]: I0308 07:12:05.404913 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549226-9zmmh"] Mar 08 07:12:05 crc kubenswrapper[4717]: I0308 07:12:05.794016 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e689532a-2392-48b1-8861-5c985aba7af7" path="/var/lib/kubelet/pods/e689532a-2392-48b1-8861-5c985aba7af7/volumes" Mar 08 07:12:34 crc kubenswrapper[4717]: I0308 07:12:34.120622 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 07:12:34 crc kubenswrapper[4717]: I0308 07:12:34.121457 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 07:12:43 crc kubenswrapper[4717]: I0308 07:12:43.110773 4717 scope.go:117] "RemoveContainer" containerID="8a9ff4bbd3fec364b1640e8301e634bc50fc7fd3eb4ed7eb0f6e97503445f899" Mar 08 07:13:04 crc kubenswrapper[4717]: I0308 07:13:04.120327 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 07:13:04 crc kubenswrapper[4717]: 
I0308 07:13:04.120822 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 07:13:11 crc kubenswrapper[4717]: I0308 07:13:11.704917 4717 generic.go:334] "Generic (PLEG): container finished" podID="299901e4-82e9-4b87-b34a-177d62deaa7b" containerID="d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996" exitCode=0 Mar 08 07:13:11 crc kubenswrapper[4717]: I0308 07:13:11.705033 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k72vp/must-gather-hcmhm" event={"ID":"299901e4-82e9-4b87-b34a-177d62deaa7b","Type":"ContainerDied","Data":"d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996"} Mar 08 07:13:11 crc kubenswrapper[4717]: I0308 07:13:11.706287 4717 scope.go:117] "RemoveContainer" containerID="d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996" Mar 08 07:13:11 crc kubenswrapper[4717]: I0308 07:13:11.837388 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k72vp_must-gather-hcmhm_299901e4-82e9-4b87-b34a-177d62deaa7b/gather/0.log" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.073899 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-k72vp/must-gather-hcmhm"] Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.074997 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-k72vp/must-gather-hcmhm" podUID="299901e4-82e9-4b87-b34a-177d62deaa7b" containerName="copy" containerID="cri-o://cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263" gracePeriod=2 Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.082933 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-k72vp/must-gather-hcmhm"] Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.533420 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k72vp_must-gather-hcmhm_299901e4-82e9-4b87-b34a-177d62deaa7b/copy/0.log" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.534166 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k72vp/must-gather-hcmhm" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.642914 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mqr2\" (UniqueName: \"kubernetes.io/projected/299901e4-82e9-4b87-b34a-177d62deaa7b-kube-api-access-5mqr2\") pod \"299901e4-82e9-4b87-b34a-177d62deaa7b\" (UID: \"299901e4-82e9-4b87-b34a-177d62deaa7b\") " Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.643149 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/299901e4-82e9-4b87-b34a-177d62deaa7b-must-gather-output\") pod \"299901e4-82e9-4b87-b34a-177d62deaa7b\" (UID: \"299901e4-82e9-4b87-b34a-177d62deaa7b\") " Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.654443 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/299901e4-82e9-4b87-b34a-177d62deaa7b-kube-api-access-5mqr2" (OuterVolumeSpecName: "kube-api-access-5mqr2") pod "299901e4-82e9-4b87-b34a-177d62deaa7b" (UID: "299901e4-82e9-4b87-b34a-177d62deaa7b"). InnerVolumeSpecName "kube-api-access-5mqr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.745212 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mqr2\" (UniqueName: \"kubernetes.io/projected/299901e4-82e9-4b87-b34a-177d62deaa7b-kube-api-access-5mqr2\") on node \"crc\" DevicePath \"\"" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.835823 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/299901e4-82e9-4b87-b34a-177d62deaa7b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "299901e4-82e9-4b87-b34a-177d62deaa7b" (UID: "299901e4-82e9-4b87-b34a-177d62deaa7b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.842596 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-k72vp_must-gather-hcmhm_299901e4-82e9-4b87-b34a-177d62deaa7b/copy/0.log" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.842942 4717 generic.go:334] "Generic (PLEG): container finished" podID="299901e4-82e9-4b87-b34a-177d62deaa7b" containerID="cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263" exitCode=143 Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.842988 4717 scope.go:117] "RemoveContainer" containerID="cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.843036 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k72vp/must-gather-hcmhm" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.847585 4717 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/299901e4-82e9-4b87-b34a-177d62deaa7b-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.859530 4717 scope.go:117] "RemoveContainer" containerID="d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.940007 4717 scope.go:117] "RemoveContainer" containerID="cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263" Mar 08 07:13:23 crc kubenswrapper[4717]: E0308 07:13:23.940637 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263\": container with ID starting with cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263 not found: ID does not exist" containerID="cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.940680 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263"} err="failed to get container status \"cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263\": rpc error: code = NotFound desc = could not find container \"cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263\": container with ID starting with cbd04aa8ebd63cf8e3b79cb3df0b928a9302d91fed5c6984ee9eec4b3f0b7263 not found: ID does not exist" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.940720 4717 scope.go:117] "RemoveContainer" containerID="d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996" Mar 08 07:13:23 crc kubenswrapper[4717]: E0308 
07:13:23.941136 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996\": container with ID starting with d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996 not found: ID does not exist" containerID="d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996" Mar 08 07:13:23 crc kubenswrapper[4717]: I0308 07:13:23.941187 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996"} err="failed to get container status \"d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996\": rpc error: code = NotFound desc = could not find container \"d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996\": container with ID starting with d8a853b3c47a52aeebaee1276b83eca1a2872ebbabd4fac2e4efc7aa5ce43996 not found: ID does not exist" Mar 08 07:13:25 crc kubenswrapper[4717]: I0308 07:13:25.802319 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="299901e4-82e9-4b87-b34a-177d62deaa7b" path="/var/lib/kubelet/pods/299901e4-82e9-4b87-b34a-177d62deaa7b/volumes" Mar 08 07:13:34 crc kubenswrapper[4717]: I0308 07:13:34.120429 4717 patch_prober.go:28] interesting pod/machine-config-daemon-tb7pf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 07:13:34 crc kubenswrapper[4717]: I0308 07:13:34.120933 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 08 07:13:34 crc kubenswrapper[4717]: I0308 07:13:34.120980 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" Mar 08 07:13:34 crc kubenswrapper[4717]: I0308 07:13:34.121719 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17"} pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 07:13:34 crc kubenswrapper[4717]: I0308 07:13:34.121769 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerName="machine-config-daemon" containerID="cri-o://8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" gracePeriod=600 Mar 08 07:13:34 crc kubenswrapper[4717]: E0308 07:13:34.252573 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:13:34 crc kubenswrapper[4717]: I0308 07:13:34.983900 4717 generic.go:334] "Generic (PLEG): container finished" podID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" exitCode=0 Mar 08 07:13:34 crc kubenswrapper[4717]: I0308 07:13:34.983941 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
event={"ID":"7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e","Type":"ContainerDied","Data":"8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17"} Mar 08 07:13:34 crc kubenswrapper[4717]: I0308 07:13:34.984226 4717 scope.go:117] "RemoveContainer" containerID="efc60f3a3d797f62c0e0f878c3cc6f0020d779e8e22bd018a1c6e5cf0f474597" Mar 08 07:13:34 crc kubenswrapper[4717]: I0308 07:13:34.984958 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:13:34 crc kubenswrapper[4717]: E0308 07:13:34.985271 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:13:43 crc kubenswrapper[4717]: I0308 07:13:43.249138 4717 scope.go:117] "RemoveContainer" containerID="4b87603a7f42e143849d1a837ae59917645ea6a6a2aa39c97ce336e1c59429f8" Mar 08 07:13:43 crc kubenswrapper[4717]: I0308 07:13:43.284581 4717 scope.go:117] "RemoveContainer" containerID="6aa1d56881f16e87457ac7a3962865e335782787ec84428edff6878f108953a8" Mar 08 07:13:48 crc kubenswrapper[4717]: I0308 07:13:48.781947 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:13:48 crc kubenswrapper[4717]: E0308 07:13:48.783822 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.141187 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549234-pj9mn"] Mar 08 07:14:00 crc kubenswrapper[4717]: E0308 07:14:00.142293 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="299901e4-82e9-4b87-b34a-177d62deaa7b" containerName="copy" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.142311 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="299901e4-82e9-4b87-b34a-177d62deaa7b" containerName="copy" Mar 08 07:14:00 crc kubenswrapper[4717]: E0308 07:14:00.142356 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7476f9a4-fc17-4134-abda-275e7abd1f43" containerName="oc" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.142366 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7476f9a4-fc17-4134-abda-275e7abd1f43" containerName="oc" Mar 08 07:14:00 crc kubenswrapper[4717]: E0308 07:14:00.142387 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="299901e4-82e9-4b87-b34a-177d62deaa7b" containerName="gather" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.142395 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="299901e4-82e9-4b87-b34a-177d62deaa7b" containerName="gather" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.142615 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7476f9a4-fc17-4134-abda-275e7abd1f43" containerName="oc" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.142635 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="299901e4-82e9-4b87-b34a-177d62deaa7b" containerName="copy" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.142652 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="299901e4-82e9-4b87-b34a-177d62deaa7b" containerName="gather" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.143478 4717 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549234-pj9mn" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.150340 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.150670 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.150856 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.153299 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549234-pj9mn"] Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.200056 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59d7t\" (UniqueName: \"kubernetes.io/projected/dc760b16-20ea-47c7-92bf-2339db2cfd67-kube-api-access-59d7t\") pod \"auto-csr-approver-29549234-pj9mn\" (UID: \"dc760b16-20ea-47c7-92bf-2339db2cfd67\") " pod="openshift-infra/auto-csr-approver-29549234-pj9mn" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.301849 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59d7t\" (UniqueName: \"kubernetes.io/projected/dc760b16-20ea-47c7-92bf-2339db2cfd67-kube-api-access-59d7t\") pod \"auto-csr-approver-29549234-pj9mn\" (UID: \"dc760b16-20ea-47c7-92bf-2339db2cfd67\") " pod="openshift-infra/auto-csr-approver-29549234-pj9mn" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.321754 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59d7t\" (UniqueName: \"kubernetes.io/projected/dc760b16-20ea-47c7-92bf-2339db2cfd67-kube-api-access-59d7t\") pod \"auto-csr-approver-29549234-pj9mn\" (UID: 
\"dc760b16-20ea-47c7-92bf-2339db2cfd67\") " pod="openshift-infra/auto-csr-approver-29549234-pj9mn" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.470111 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549234-pj9mn" Mar 08 07:14:00 crc kubenswrapper[4717]: I0308 07:14:00.944372 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549234-pj9mn"] Mar 08 07:14:01 crc kubenswrapper[4717]: I0308 07:14:01.249444 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549234-pj9mn" event={"ID":"dc760b16-20ea-47c7-92bf-2339db2cfd67","Type":"ContainerStarted","Data":"a293a8155ccebd00c3300863ae99cd2788764bf6da21a66940b4bb1622792dcc"} Mar 08 07:14:02 crc kubenswrapper[4717]: I0308 07:14:02.260481 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549234-pj9mn" event={"ID":"dc760b16-20ea-47c7-92bf-2339db2cfd67","Type":"ContainerStarted","Data":"6631486095fd078ba20fa779a39e20fe62c2f9eb57031a7d5a95548cff045ba2"} Mar 08 07:14:02 crc kubenswrapper[4717]: I0308 07:14:02.288806 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29549234-pj9mn" podStartSLOduration=1.5148089470000001 podStartE2EDuration="2.288778226s" podCreationTimestamp="2026-03-08 07:14:00 +0000 UTC" firstStartedPulling="2026-03-08 07:14:00.955090575 +0000 UTC m=+6467.872739439" lastFinishedPulling="2026-03-08 07:14:01.729059844 +0000 UTC m=+6468.646708718" observedRunningTime="2026-03-08 07:14:02.280285948 +0000 UTC m=+6469.197934792" watchObservedRunningTime="2026-03-08 07:14:02.288778226 +0000 UTC m=+6469.206427110" Mar 08 07:14:02 crc kubenswrapper[4717]: I0308 07:14:02.782575 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:14:02 crc kubenswrapper[4717]: E0308 
07:14:02.783362 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:14:03 crc kubenswrapper[4717]: I0308 07:14:03.271625 4717 generic.go:334] "Generic (PLEG): container finished" podID="dc760b16-20ea-47c7-92bf-2339db2cfd67" containerID="6631486095fd078ba20fa779a39e20fe62c2f9eb57031a7d5a95548cff045ba2" exitCode=0 Mar 08 07:14:03 crc kubenswrapper[4717]: I0308 07:14:03.271739 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549234-pj9mn" event={"ID":"dc760b16-20ea-47c7-92bf-2339db2cfd67","Type":"ContainerDied","Data":"6631486095fd078ba20fa779a39e20fe62c2f9eb57031a7d5a95548cff045ba2"} Mar 08 07:14:04 crc kubenswrapper[4717]: I0308 07:14:04.641863 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549234-pj9mn" Mar 08 07:14:04 crc kubenswrapper[4717]: I0308 07:14:04.733139 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59d7t\" (UniqueName: \"kubernetes.io/projected/dc760b16-20ea-47c7-92bf-2339db2cfd67-kube-api-access-59d7t\") pod \"dc760b16-20ea-47c7-92bf-2339db2cfd67\" (UID: \"dc760b16-20ea-47c7-92bf-2339db2cfd67\") " Mar 08 07:14:04 crc kubenswrapper[4717]: I0308 07:14:04.741540 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc760b16-20ea-47c7-92bf-2339db2cfd67-kube-api-access-59d7t" (OuterVolumeSpecName: "kube-api-access-59d7t") pod "dc760b16-20ea-47c7-92bf-2339db2cfd67" (UID: "dc760b16-20ea-47c7-92bf-2339db2cfd67"). InnerVolumeSpecName "kube-api-access-59d7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:14:04 crc kubenswrapper[4717]: I0308 07:14:04.837040 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59d7t\" (UniqueName: \"kubernetes.io/projected/dc760b16-20ea-47c7-92bf-2339db2cfd67-kube-api-access-59d7t\") on node \"crc\" DevicePath \"\"" Mar 08 07:14:05 crc kubenswrapper[4717]: I0308 07:14:05.291927 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549234-pj9mn" event={"ID":"dc760b16-20ea-47c7-92bf-2339db2cfd67","Type":"ContainerDied","Data":"a293a8155ccebd00c3300863ae99cd2788764bf6da21a66940b4bb1622792dcc"} Mar 08 07:14:05 crc kubenswrapper[4717]: I0308 07:14:05.291972 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a293a8155ccebd00c3300863ae99cd2788764bf6da21a66940b4bb1622792dcc" Mar 08 07:14:05 crc kubenswrapper[4717]: I0308 07:14:05.292066 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549234-pj9mn" Mar 08 07:14:05 crc kubenswrapper[4717]: I0308 07:14:05.356995 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549228-7mmbs"] Mar 08 07:14:05 crc kubenswrapper[4717]: I0308 07:14:05.370079 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549228-7mmbs"] Mar 08 07:14:05 crc kubenswrapper[4717]: I0308 07:14:05.793522 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10188bd2-f97d-4a4c-8695-1920886babf9" path="/var/lib/kubelet/pods/10188bd2-f97d-4a4c-8695-1920886babf9/volumes" Mar 08 07:14:13 crc kubenswrapper[4717]: I0308 07:14:13.809318 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:14:13 crc kubenswrapper[4717]: E0308 07:14:13.810493 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:14:25 crc kubenswrapper[4717]: I0308 07:14:25.782950 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:14:25 crc kubenswrapper[4717]: E0308 07:14:25.790326 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:14:36 crc kubenswrapper[4717]: I0308 07:14:36.783153 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:14:36 crc kubenswrapper[4717]: E0308 07:14:36.784705 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:14:43 crc kubenswrapper[4717]: I0308 07:14:43.477190 4717 scope.go:117] "RemoveContainer" containerID="eb069b774e4977079c1beac49e9fa35a5ade38265da1aafc261c2802375382b2" Mar 08 07:14:48 crc kubenswrapper[4717]: I0308 07:14:48.782796 4717 scope.go:117] "RemoveContainer" 
containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:14:48 crc kubenswrapper[4717]: E0308 07:14:48.784156 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:14:59 crc kubenswrapper[4717]: I0308 07:14:59.782182 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:14:59 crc kubenswrapper[4717]: E0308 07:14:59.783063 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.158468 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t"] Mar 08 07:15:00 crc kubenswrapper[4717]: E0308 07:15:00.159766 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc760b16-20ea-47c7-92bf-2339db2cfd67" containerName="oc" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.159807 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc760b16-20ea-47c7-92bf-2339db2cfd67" containerName="oc" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.160290 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc760b16-20ea-47c7-92bf-2339db2cfd67" containerName="oc" Mar 
08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.161675 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.164222 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.164528 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.173599 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t"] Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.253748 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ba2639b-47a7-4a73-b458-909a68510281-config-volume\") pod \"collect-profiles-29549235-l9v9t\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.253836 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ba2639b-47a7-4a73-b458-909a68510281-secret-volume\") pod \"collect-profiles-29549235-l9v9t\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.253949 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbbsq\" (UniqueName: \"kubernetes.io/projected/4ba2639b-47a7-4a73-b458-909a68510281-kube-api-access-vbbsq\") pod 
\"collect-profiles-29549235-l9v9t\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.356302 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbbsq\" (UniqueName: \"kubernetes.io/projected/4ba2639b-47a7-4a73-b458-909a68510281-kube-api-access-vbbsq\") pod \"collect-profiles-29549235-l9v9t\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.356429 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ba2639b-47a7-4a73-b458-909a68510281-config-volume\") pod \"collect-profiles-29549235-l9v9t\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.356616 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ba2639b-47a7-4a73-b458-909a68510281-secret-volume\") pod \"collect-profiles-29549235-l9v9t\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.358043 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ba2639b-47a7-4a73-b458-909a68510281-config-volume\") pod \"collect-profiles-29549235-l9v9t\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.368405 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/4ba2639b-47a7-4a73-b458-909a68510281-secret-volume\") pod \"collect-profiles-29549235-l9v9t\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.381657 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbbsq\" (UniqueName: \"kubernetes.io/projected/4ba2639b-47a7-4a73-b458-909a68510281-kube-api-access-vbbsq\") pod \"collect-profiles-29549235-l9v9t\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:00 crc kubenswrapper[4717]: I0308 07:15:00.493801 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:01 crc kubenswrapper[4717]: I0308 07:15:01.024890 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t"] Mar 08 07:15:01 crc kubenswrapper[4717]: W0308 07:15:01.040729 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ba2639b_47a7_4a73_b458_909a68510281.slice/crio-e1380468181a1209bc58615794dc6f128465cf946459ee031511637a5bfccbb9 WatchSource:0}: Error finding container e1380468181a1209bc58615794dc6f128465cf946459ee031511637a5bfccbb9: Status 404 returned error can't find the container with id e1380468181a1209bc58615794dc6f128465cf946459ee031511637a5bfccbb9 Mar 08 07:15:01 crc kubenswrapper[4717]: I0308 07:15:01.922303 4717 generic.go:334] "Generic (PLEG): container finished" podID="4ba2639b-47a7-4a73-b458-909a68510281" containerID="804c12a09fc82895e25e941c4453e4b0eb6e47c831fdb97495cc9c56d3ff5441" exitCode=0 Mar 08 07:15:01 crc kubenswrapper[4717]: I0308 07:15:01.922456 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" event={"ID":"4ba2639b-47a7-4a73-b458-909a68510281","Type":"ContainerDied","Data":"804c12a09fc82895e25e941c4453e4b0eb6e47c831fdb97495cc9c56d3ff5441"} Mar 08 07:15:01 crc kubenswrapper[4717]: I0308 07:15:01.922967 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" event={"ID":"4ba2639b-47a7-4a73-b458-909a68510281","Type":"ContainerStarted","Data":"e1380468181a1209bc58615794dc6f128465cf946459ee031511637a5bfccbb9"} Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.361413 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.525061 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ba2639b-47a7-4a73-b458-909a68510281-secret-volume\") pod \"4ba2639b-47a7-4a73-b458-909a68510281\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.525217 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ba2639b-47a7-4a73-b458-909a68510281-config-volume\") pod \"4ba2639b-47a7-4a73-b458-909a68510281\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.525378 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbbsq\" (UniqueName: \"kubernetes.io/projected/4ba2639b-47a7-4a73-b458-909a68510281-kube-api-access-vbbsq\") pod \"4ba2639b-47a7-4a73-b458-909a68510281\" (UID: \"4ba2639b-47a7-4a73-b458-909a68510281\") " Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.526261 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/4ba2639b-47a7-4a73-b458-909a68510281-config-volume" (OuterVolumeSpecName: "config-volume") pod "4ba2639b-47a7-4a73-b458-909a68510281" (UID: "4ba2639b-47a7-4a73-b458-909a68510281"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.531340 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba2639b-47a7-4a73-b458-909a68510281-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4ba2639b-47a7-4a73-b458-909a68510281" (UID: "4ba2639b-47a7-4a73-b458-909a68510281"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.532584 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba2639b-47a7-4a73-b458-909a68510281-kube-api-access-vbbsq" (OuterVolumeSpecName: "kube-api-access-vbbsq") pod "4ba2639b-47a7-4a73-b458-909a68510281" (UID: "4ba2639b-47a7-4a73-b458-909a68510281"). InnerVolumeSpecName "kube-api-access-vbbsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.627696 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ba2639b-47a7-4a73-b458-909a68510281-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.627735 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbbsq\" (UniqueName: \"kubernetes.io/projected/4ba2639b-47a7-4a73-b458-909a68510281-kube-api-access-vbbsq\") on node \"crc\" DevicePath \"\"" Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.627750 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ba2639b-47a7-4a73-b458-909a68510281-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.950939 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" event={"ID":"4ba2639b-47a7-4a73-b458-909a68510281","Type":"ContainerDied","Data":"e1380468181a1209bc58615794dc6f128465cf946459ee031511637a5bfccbb9"} Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.951322 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1380468181a1209bc58615794dc6f128465cf946459ee031511637a5bfccbb9" Mar 08 07:15:03 crc kubenswrapper[4717]: I0308 07:15:03.951065 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549235-l9v9t" Mar 08 07:15:04 crc kubenswrapper[4717]: I0308 07:15:04.443176 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg"] Mar 08 07:15:04 crc kubenswrapper[4717]: I0308 07:15:04.451117 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549190-rg7zg"] Mar 08 07:15:05 crc kubenswrapper[4717]: I0308 07:15:05.792947 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d25fd984-997f-4aa9-8d01-a6acd96b1841" path="/var/lib/kubelet/pods/d25fd984-997f-4aa9-8d01-a6acd96b1841/volumes" Mar 08 07:15:11 crc kubenswrapper[4717]: I0308 07:15:11.782976 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:15:11 crc kubenswrapper[4717]: E0308 07:15:11.785429 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:15:23 crc kubenswrapper[4717]: I0308 07:15:23.794682 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:15:23 crc kubenswrapper[4717]: E0308 07:15:23.795777 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:15:35 crc kubenswrapper[4717]: I0308 07:15:35.782510 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:15:35 crc kubenswrapper[4717]: E0308 07:15:35.783565 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.747841 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nq6qs"] Mar 08 07:15:38 crc kubenswrapper[4717]: E0308 07:15:38.748856 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba2639b-47a7-4a73-b458-909a68510281" containerName="collect-profiles" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.748877 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba2639b-47a7-4a73-b458-909a68510281" containerName="collect-profiles" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.749290 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba2639b-47a7-4a73-b458-909a68510281" containerName="collect-profiles" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.751795 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.786791 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nq6qs"] Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.850560 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-catalog-content\") pod \"redhat-marketplace-nq6qs\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.850706 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-utilities\") pod \"redhat-marketplace-nq6qs\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.850755 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbfx7\" (UniqueName: \"kubernetes.io/projected/9f62d59f-606b-4ff8-8e7f-f057680d137b-kube-api-access-nbfx7\") pod \"redhat-marketplace-nq6qs\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.952153 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-utilities\") pod \"redhat-marketplace-nq6qs\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.952254 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nbfx7\" (UniqueName: \"kubernetes.io/projected/9f62d59f-606b-4ff8-8e7f-f057680d137b-kube-api-access-nbfx7\") pod \"redhat-marketplace-nq6qs\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.952373 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-catalog-content\") pod \"redhat-marketplace-nq6qs\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.954035 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-utilities\") pod \"redhat-marketplace-nq6qs\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:38 crc kubenswrapper[4717]: I0308 07:15:38.954845 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-catalog-content\") pod \"redhat-marketplace-nq6qs\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:39 crc kubenswrapper[4717]: I0308 07:15:39.003524 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbfx7\" (UniqueName: \"kubernetes.io/projected/9f62d59f-606b-4ff8-8e7f-f057680d137b-kube-api-access-nbfx7\") pod \"redhat-marketplace-nq6qs\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:39 crc kubenswrapper[4717]: I0308 07:15:39.085806 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:39 crc kubenswrapper[4717]: I0308 07:15:39.557008 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nq6qs"] Mar 08 07:15:40 crc kubenswrapper[4717]: I0308 07:15:40.403131 4717 generic.go:334] "Generic (PLEG): container finished" podID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerID="51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2" exitCode=0 Mar 08 07:15:40 crc kubenswrapper[4717]: I0308 07:15:40.403204 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nq6qs" event={"ID":"9f62d59f-606b-4ff8-8e7f-f057680d137b","Type":"ContainerDied","Data":"51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2"} Mar 08 07:15:40 crc kubenswrapper[4717]: I0308 07:15:40.403543 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nq6qs" event={"ID":"9f62d59f-606b-4ff8-8e7f-f057680d137b","Type":"ContainerStarted","Data":"9bff2b09335cc24fde1495bc7f92ae7a77157d27106c05818e7138c495252f73"} Mar 08 07:15:40 crc kubenswrapper[4717]: I0308 07:15:40.407618 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 07:15:41 crc kubenswrapper[4717]: I0308 07:15:41.416779 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nq6qs" event={"ID":"9f62d59f-606b-4ff8-8e7f-f057680d137b","Type":"ContainerStarted","Data":"9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d"} Mar 08 07:15:42 crc kubenswrapper[4717]: I0308 07:15:42.436796 4717 generic.go:334] "Generic (PLEG): container finished" podID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerID="9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d" exitCode=0 Mar 08 07:15:42 crc kubenswrapper[4717]: I0308 07:15:42.436859 4717 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-nq6qs" event={"ID":"9f62d59f-606b-4ff8-8e7f-f057680d137b","Type":"ContainerDied","Data":"9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d"} Mar 08 07:15:43 crc kubenswrapper[4717]: I0308 07:15:43.450572 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nq6qs" event={"ID":"9f62d59f-606b-4ff8-8e7f-f057680d137b","Type":"ContainerStarted","Data":"4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55"} Mar 08 07:15:43 crc kubenswrapper[4717]: I0308 07:15:43.471443 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nq6qs" podStartSLOduration=2.969635685 podStartE2EDuration="5.471419312s" podCreationTimestamp="2026-03-08 07:15:38 +0000 UTC" firstStartedPulling="2026-03-08 07:15:40.407174435 +0000 UTC m=+6567.324823319" lastFinishedPulling="2026-03-08 07:15:42.908958082 +0000 UTC m=+6569.826606946" observedRunningTime="2026-03-08 07:15:43.470170271 +0000 UTC m=+6570.387819115" watchObservedRunningTime="2026-03-08 07:15:43.471419312 +0000 UTC m=+6570.389068186" Mar 08 07:15:43 crc kubenswrapper[4717]: I0308 07:15:43.586674 4717 scope.go:117] "RemoveContainer" containerID="c1577b0f49da40debfe5e950da589752093e423f4e29f292c1f1e867d2aa0aae" Mar 08 07:15:46 crc kubenswrapper[4717]: I0308 07:15:46.782463 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:15:46 crc kubenswrapper[4717]: E0308 07:15:46.783303 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:15:49 crc kubenswrapper[4717]: I0308 07:15:49.086950 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:49 crc kubenswrapper[4717]: I0308 07:15:49.087497 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:49 crc kubenswrapper[4717]: I0308 07:15:49.149837 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:49 crc kubenswrapper[4717]: I0308 07:15:49.593068 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:49 crc kubenswrapper[4717]: I0308 07:15:49.668869 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nq6qs"] Mar 08 07:15:51 crc kubenswrapper[4717]: I0308 07:15:51.560894 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nq6qs" podUID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerName="registry-server" containerID="cri-o://4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55" gracePeriod=2 Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.304950 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.392904 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbfx7\" (UniqueName: \"kubernetes.io/projected/9f62d59f-606b-4ff8-8e7f-f057680d137b-kube-api-access-nbfx7\") pod \"9f62d59f-606b-4ff8-8e7f-f057680d137b\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.393046 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-utilities\") pod \"9f62d59f-606b-4ff8-8e7f-f057680d137b\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.393101 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-catalog-content\") pod \"9f62d59f-606b-4ff8-8e7f-f057680d137b\" (UID: \"9f62d59f-606b-4ff8-8e7f-f057680d137b\") " Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.394200 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-utilities" (OuterVolumeSpecName: "utilities") pod "9f62d59f-606b-4ff8-8e7f-f057680d137b" (UID: "9f62d59f-606b-4ff8-8e7f-f057680d137b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.400431 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f62d59f-606b-4ff8-8e7f-f057680d137b-kube-api-access-nbfx7" (OuterVolumeSpecName: "kube-api-access-nbfx7") pod "9f62d59f-606b-4ff8-8e7f-f057680d137b" (UID: "9f62d59f-606b-4ff8-8e7f-f057680d137b"). InnerVolumeSpecName "kube-api-access-nbfx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.443873 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f62d59f-606b-4ff8-8e7f-f057680d137b" (UID: "9f62d59f-606b-4ff8-8e7f-f057680d137b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.495820 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbfx7\" (UniqueName: \"kubernetes.io/projected/9f62d59f-606b-4ff8-8e7f-f057680d137b-kube-api-access-nbfx7\") on node \"crc\" DevicePath \"\"" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.495888 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.495906 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f62d59f-606b-4ff8-8e7f-f057680d137b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.579399 4717 generic.go:334] "Generic (PLEG): container finished" podID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerID="4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55" exitCode=0 Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.579468 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nq6qs" event={"ID":"9f62d59f-606b-4ff8-8e7f-f057680d137b","Type":"ContainerDied","Data":"4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55"} Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.579514 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-nq6qs" event={"ID":"9f62d59f-606b-4ff8-8e7f-f057680d137b","Type":"ContainerDied","Data":"9bff2b09335cc24fde1495bc7f92ae7a77157d27106c05818e7138c495252f73"} Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.579531 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nq6qs" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.579555 4717 scope.go:117] "RemoveContainer" containerID="4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.611464 4717 scope.go:117] "RemoveContainer" containerID="9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.634605 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nq6qs"] Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.644922 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nq6qs"] Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.653515 4717 scope.go:117] "RemoveContainer" containerID="51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.688950 4717 scope.go:117] "RemoveContainer" containerID="4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55" Mar 08 07:15:52 crc kubenswrapper[4717]: E0308 07:15:52.689342 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55\": container with ID starting with 4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55 not found: ID does not exist" containerID="4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.689383 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55"} err="failed to get container status \"4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55\": rpc error: code = NotFound desc = could not find container \"4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55\": container with ID starting with 4483c48b5b0d2f082de15285f2af68790dca94351f4c095e4f013f109e52ff55 not found: ID does not exist" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.689408 4717 scope.go:117] "RemoveContainer" containerID="9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d" Mar 08 07:15:52 crc kubenswrapper[4717]: E0308 07:15:52.689668 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d\": container with ID starting with 9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d not found: ID does not exist" containerID="9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.689753 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d"} err="failed to get container status \"9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d\": rpc error: code = NotFound desc = could not find container \"9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d\": container with ID starting with 9166ddcf9e054f4b159151f597a596693438c33282d4f404d55368d3f7e65a5d not found: ID does not exist" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.689772 4717 scope.go:117] "RemoveContainer" containerID="51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2" Mar 08 07:15:52 crc kubenswrapper[4717]: E0308 
07:15:52.689989 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2\": container with ID starting with 51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2 not found: ID does not exist" containerID="51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2" Mar 08 07:15:52 crc kubenswrapper[4717]: I0308 07:15:52.690018 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2"} err="failed to get container status \"51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2\": rpc error: code = NotFound desc = could not find container \"51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2\": container with ID starting with 51cdd51562f5f2365beb05581cd7c94c929acd5df946e4ff6af0c148bedfada2 not found: ID does not exist" Mar 08 07:15:53 crc kubenswrapper[4717]: I0308 07:15:53.796935 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f62d59f-606b-4ff8-8e7f-f057680d137b" path="/var/lib/kubelet/pods/9f62d59f-606b-4ff8-8e7f-f057680d137b/volumes" Mar 08 07:15:57 crc kubenswrapper[4717]: I0308 07:15:57.783046 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:15:57 crc kubenswrapper[4717]: E0308 07:15:57.783547 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.158847 
4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549236-fs889"] Mar 08 07:16:00 crc kubenswrapper[4717]: E0308 07:16:00.161124 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerName="registry-server" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.161307 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerName="registry-server" Mar 08 07:16:00 crc kubenswrapper[4717]: E0308 07:16:00.161471 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerName="extract-content" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.161603 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerName="extract-content" Mar 08 07:16:00 crc kubenswrapper[4717]: E0308 07:16:00.161809 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerName="extract-utilities" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.161946 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerName="extract-utilities" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.162435 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f62d59f-606b-4ff8-8e7f-f057680d137b" containerName="registry-server" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.163859 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549236-fs889" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.166415 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.168732 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.169048 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.174922 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549236-fs889"] Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.267190 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7b9\" (UniqueName: \"kubernetes.io/projected/2aa37872-821f-4207-9338-a36a56d4858e-kube-api-access-hg7b9\") pod \"auto-csr-approver-29549236-fs889\" (UID: \"2aa37872-821f-4207-9338-a36a56d4858e\") " pod="openshift-infra/auto-csr-approver-29549236-fs889" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.369468 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7b9\" (UniqueName: \"kubernetes.io/projected/2aa37872-821f-4207-9338-a36a56d4858e-kube-api-access-hg7b9\") pod \"auto-csr-approver-29549236-fs889\" (UID: \"2aa37872-821f-4207-9338-a36a56d4858e\") " pod="openshift-infra/auto-csr-approver-29549236-fs889" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.388087 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg7b9\" (UniqueName: \"kubernetes.io/projected/2aa37872-821f-4207-9338-a36a56d4858e-kube-api-access-hg7b9\") pod \"auto-csr-approver-29549236-fs889\" (UID: \"2aa37872-821f-4207-9338-a36a56d4858e\") " 
pod="openshift-infra/auto-csr-approver-29549236-fs889" Mar 08 07:16:00 crc kubenswrapper[4717]: I0308 07:16:00.517114 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549236-fs889" Mar 08 07:16:01 crc kubenswrapper[4717]: I0308 07:16:01.025218 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549236-fs889"] Mar 08 07:16:01 crc kubenswrapper[4717]: I0308 07:16:01.676817 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549236-fs889" event={"ID":"2aa37872-821f-4207-9338-a36a56d4858e","Type":"ContainerStarted","Data":"1f5915b8fd2be32a2c3cb95857fa426b5f0b047d33f6c1876979f5aa25991753"} Mar 08 07:16:03 crc kubenswrapper[4717]: I0308 07:16:03.699182 4717 generic.go:334] "Generic (PLEG): container finished" podID="2aa37872-821f-4207-9338-a36a56d4858e" containerID="f750b7ad52daefb4ca45c63d82c8201b438156842ed6ee1d4695ebee6cdbb147" exitCode=0 Mar 08 07:16:03 crc kubenswrapper[4717]: I0308 07:16:03.699286 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549236-fs889" event={"ID":"2aa37872-821f-4207-9338-a36a56d4858e","Type":"ContainerDied","Data":"f750b7ad52daefb4ca45c63d82c8201b438156842ed6ee1d4695ebee6cdbb147"} Mar 08 07:16:05 crc kubenswrapper[4717]: I0308 07:16:05.088726 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549236-fs889" Mar 08 07:16:05 crc kubenswrapper[4717]: I0308 07:16:05.183846 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg7b9\" (UniqueName: \"kubernetes.io/projected/2aa37872-821f-4207-9338-a36a56d4858e-kube-api-access-hg7b9\") pod \"2aa37872-821f-4207-9338-a36a56d4858e\" (UID: \"2aa37872-821f-4207-9338-a36a56d4858e\") " Mar 08 07:16:05 crc kubenswrapper[4717]: I0308 07:16:05.189619 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa37872-821f-4207-9338-a36a56d4858e-kube-api-access-hg7b9" (OuterVolumeSpecName: "kube-api-access-hg7b9") pod "2aa37872-821f-4207-9338-a36a56d4858e" (UID: "2aa37872-821f-4207-9338-a36a56d4858e"). InnerVolumeSpecName "kube-api-access-hg7b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:16:05 crc kubenswrapper[4717]: I0308 07:16:05.286803 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg7b9\" (UniqueName: \"kubernetes.io/projected/2aa37872-821f-4207-9338-a36a56d4858e-kube-api-access-hg7b9\") on node \"crc\" DevicePath \"\"" Mar 08 07:16:05 crc kubenswrapper[4717]: I0308 07:16:05.733340 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549236-fs889" event={"ID":"2aa37872-821f-4207-9338-a36a56d4858e","Type":"ContainerDied","Data":"1f5915b8fd2be32a2c3cb95857fa426b5f0b047d33f6c1876979f5aa25991753"} Mar 08 07:16:05 crc kubenswrapper[4717]: I0308 07:16:05.733386 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f5915b8fd2be32a2c3cb95857fa426b5f0b047d33f6c1876979f5aa25991753" Mar 08 07:16:05 crc kubenswrapper[4717]: I0308 07:16:05.733459 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549236-fs889" Mar 08 07:16:06 crc kubenswrapper[4717]: I0308 07:16:06.169775 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549230-jsb68"] Mar 08 07:16:06 crc kubenswrapper[4717]: I0308 07:16:06.190289 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549230-jsb68"] Mar 08 07:16:07 crc kubenswrapper[4717]: I0308 07:16:07.802849 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="578d3092-82b0-4cb5-b1d0-8f34a4bdc871" path="/var/lib/kubelet/pods/578d3092-82b0-4cb5-b1d0-8f34a4bdc871/volumes" Mar 08 07:16:12 crc kubenswrapper[4717]: I0308 07:16:12.781435 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:16:12 crc kubenswrapper[4717]: E0308 07:16:12.782594 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:16:23 crc kubenswrapper[4717]: I0308 07:16:23.803047 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:16:23 crc kubenswrapper[4717]: E0308 07:16:23.804223 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" 
podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:16:35 crc kubenswrapper[4717]: I0308 07:16:35.782152 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:16:35 crc kubenswrapper[4717]: E0308 07:16:35.783237 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:16:43 crc kubenswrapper[4717]: I0308 07:16:43.654335 4717 scope.go:117] "RemoveContainer" containerID="438ee82ba6576f50ea145e22e1fc297f271c54fb76d518e92640d6ada2fc1761" Mar 08 07:16:47 crc kubenswrapper[4717]: I0308 07:16:47.781832 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:16:47 crc kubenswrapper[4717]: E0308 07:16:47.782660 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:17:02 crc kubenswrapper[4717]: I0308 07:17:02.782466 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:17:02 crc kubenswrapper[4717]: E0308 07:17:02.783310 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:17:16 crc kubenswrapper[4717]: I0308 07:17:16.782699 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:17:16 crc kubenswrapper[4717]: E0308 07:17:16.783538 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:17:25 crc kubenswrapper[4717]: E0308 07:17:25.098215 4717 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.318s" Mar 08 07:17:29 crc kubenswrapper[4717]: I0308 07:17:29.783867 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:17:29 crc kubenswrapper[4717]: E0308 07:17:29.784843 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:17:44 crc kubenswrapper[4717]: I0308 07:17:44.781901 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:17:44 crc 
kubenswrapper[4717]: E0308 07:17:44.782858 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:17:55 crc kubenswrapper[4717]: I0308 07:17:55.782334 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:17:55 crc kubenswrapper[4717]: E0308 07:17:55.783240 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.166194 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549238-vfrzh"] Mar 08 07:18:00 crc kubenswrapper[4717]: E0308 07:18:00.167313 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa37872-821f-4207-9338-a36a56d4858e" containerName="oc" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.167332 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa37872-821f-4207-9338-a36a56d4858e" containerName="oc" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.167584 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa37872-821f-4207-9338-a36a56d4858e" containerName="oc" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.168407 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549238-vfrzh" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.170937 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.171224 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-w8tpm" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.171768 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.180163 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549238-vfrzh"] Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.256545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7hp8\" (UniqueName: \"kubernetes.io/projected/f3092d70-71c2-484e-841e-820fcf53c24b-kube-api-access-z7hp8\") pod \"auto-csr-approver-29549238-vfrzh\" (UID: \"f3092d70-71c2-484e-841e-820fcf53c24b\") " pod="openshift-infra/auto-csr-approver-29549238-vfrzh" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.359200 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7hp8\" (UniqueName: \"kubernetes.io/projected/f3092d70-71c2-484e-841e-820fcf53c24b-kube-api-access-z7hp8\") pod \"auto-csr-approver-29549238-vfrzh\" (UID: \"f3092d70-71c2-484e-841e-820fcf53c24b\") " pod="openshift-infra/auto-csr-approver-29549238-vfrzh" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.392679 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7hp8\" (UniqueName: \"kubernetes.io/projected/f3092d70-71c2-484e-841e-820fcf53c24b-kube-api-access-z7hp8\") pod \"auto-csr-approver-29549238-vfrzh\" (UID: \"f3092d70-71c2-484e-841e-820fcf53c24b\") " 
pod="openshift-infra/auto-csr-approver-29549238-vfrzh" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.490708 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549238-vfrzh" Mar 08 07:18:00 crc kubenswrapper[4717]: I0308 07:18:00.941245 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549238-vfrzh"] Mar 08 07:18:00 crc kubenswrapper[4717]: W0308 07:18:00.951072 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3092d70_71c2_484e_841e_820fcf53c24b.slice/crio-e73d63ecb82eba496d09a9665aabfa8f8f6a28733962dcc0a12cbd142d02b4bf WatchSource:0}: Error finding container e73d63ecb82eba496d09a9665aabfa8f8f6a28733962dcc0a12cbd142d02b4bf: Status 404 returned error can't find the container with id e73d63ecb82eba496d09a9665aabfa8f8f6a28733962dcc0a12cbd142d02b4bf Mar 08 07:18:01 crc kubenswrapper[4717]: I0308 07:18:01.525657 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549238-vfrzh" event={"ID":"f3092d70-71c2-484e-841e-820fcf53c24b","Type":"ContainerStarted","Data":"e73d63ecb82eba496d09a9665aabfa8f8f6a28733962dcc0a12cbd142d02b4bf"} Mar 08 07:18:02 crc kubenswrapper[4717]: I0308 07:18:02.535063 4717 generic.go:334] "Generic (PLEG): container finished" podID="f3092d70-71c2-484e-841e-820fcf53c24b" containerID="feb3df91d22fb3c4f47fd1ad45f9b2ba3a2ed48295095b88fe5d548d675e8f57" exitCode=0 Mar 08 07:18:02 crc kubenswrapper[4717]: I0308 07:18:02.535213 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549238-vfrzh" event={"ID":"f3092d70-71c2-484e-841e-820fcf53c24b","Type":"ContainerDied","Data":"feb3df91d22fb3c4f47fd1ad45f9b2ba3a2ed48295095b88fe5d548d675e8f57"} Mar 08 07:18:03 crc kubenswrapper[4717]: I0308 07:18:03.888667 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549238-vfrzh" Mar 08 07:18:03 crc kubenswrapper[4717]: I0308 07:18:03.932710 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7hp8\" (UniqueName: \"kubernetes.io/projected/f3092d70-71c2-484e-841e-820fcf53c24b-kube-api-access-z7hp8\") pod \"f3092d70-71c2-484e-841e-820fcf53c24b\" (UID: \"f3092d70-71c2-484e-841e-820fcf53c24b\") " Mar 08 07:18:03 crc kubenswrapper[4717]: I0308 07:18:03.938345 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3092d70-71c2-484e-841e-820fcf53c24b-kube-api-access-z7hp8" (OuterVolumeSpecName: "kube-api-access-z7hp8") pod "f3092d70-71c2-484e-841e-820fcf53c24b" (UID: "f3092d70-71c2-484e-841e-820fcf53c24b"). InnerVolumeSpecName "kube-api-access-z7hp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 07:18:04 crc kubenswrapper[4717]: I0308 07:18:04.034760 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7hp8\" (UniqueName: \"kubernetes.io/projected/f3092d70-71c2-484e-841e-820fcf53c24b-kube-api-access-z7hp8\") on node \"crc\" DevicePath \"\"" Mar 08 07:18:04 crc kubenswrapper[4717]: I0308 07:18:04.564673 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549238-vfrzh" event={"ID":"f3092d70-71c2-484e-841e-820fcf53c24b","Type":"ContainerDied","Data":"e73d63ecb82eba496d09a9665aabfa8f8f6a28733962dcc0a12cbd142d02b4bf"} Mar 08 07:18:04 crc kubenswrapper[4717]: I0308 07:18:04.564765 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e73d63ecb82eba496d09a9665aabfa8f8f6a28733962dcc0a12cbd142d02b4bf" Mar 08 07:18:04 crc kubenswrapper[4717]: I0308 07:18:04.564764 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549238-vfrzh" Mar 08 07:18:04 crc kubenswrapper[4717]: I0308 07:18:04.979604 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549232-gw7st"] Mar 08 07:18:04 crc kubenswrapper[4717]: I0308 07:18:04.996327 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549232-gw7st"] Mar 08 07:18:05 crc kubenswrapper[4717]: I0308 07:18:05.791882 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7476f9a4-fc17-4134-abda-275e7abd1f43" path="/var/lib/kubelet/pods/7476f9a4-fc17-4134-abda-275e7abd1f43/volumes" Mar 08 07:18:09 crc kubenswrapper[4717]: I0308 07:18:09.782540 4717 scope.go:117] "RemoveContainer" containerID="8dd3ed1c3b3cd63bebfc7c1b2b9c3b213e2c1f530007096d2ff32a5162b2ad17" Mar 08 07:18:09 crc kubenswrapper[4717]: E0308 07:18:09.783205 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tb7pf_openshift-machine-config-operator(7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e)\"" pod="openshift-machine-config-operator/machine-config-daemon-tb7pf" podUID="7cc2722f-d1ec-4b4e-a3e2-91f78b440a8e"